Oct 07 11:20:32 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 11:20:32 crc restorecon[4661]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 11:20:32 crc restorecon[4661]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 
11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 
crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 
11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 11:20:32 crc restorecon[4661]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 
crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc 
restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:32 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc 
restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 11:20:33 crc restorecon[4661]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 11:20:33 crc kubenswrapper[4700]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.666130 4700 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680125 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680199 4700 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680210 4700 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680219 4700 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680227 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680237 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680246 4700 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680280 4700 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680289 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680342 4700 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680354 4700 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680363 4700 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680371 4700 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680382 4700 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680393 4700 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680433 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680442 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680451 4700 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680462 4700 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680473 4700 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680481 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680521 4700 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680530 4700 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680548 4700 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680557 4700 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680565 4700 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680573 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680611 4700 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680620 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680628 4700 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680640 4700 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680650 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680658 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680693 4700 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680701 4700 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680712 4700 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680720 4700 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680728 4700 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680736 4700 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680772 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680781 4700 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680790 4700 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680798 4700 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680805 4700 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680813 4700 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680821 4700 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680828 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680864 4700 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680871 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680881 4700 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680892 4700 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680899 4700 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680907 4700 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680916 4700 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680952 4700 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680959 4700 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680967 4700 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680975 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680983 4700 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680991 4700 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.680999 4700 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681033 4700 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681042 4700 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681050 4700 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681077 4700 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681086 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681122 4700 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681130 4700 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681138 4700 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681146 4700 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.681154 4700 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681404 4700 flags.go:64] FLAG: --address="0.0.0.0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681424 4700 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681440 4700 flags.go:64] FLAG: --anonymous-auth="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681482 4700 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681495 4700 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681505 4700 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681517 4700 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681528 4700 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681538 4700 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681577 4700 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681588 4700 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681601 4700 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681610 4700 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681619 4700 flags.go:64] FLAG: --cgroup-root=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681658 4700 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681667 4700 flags.go:64] FLAG: --client-ca-file=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681676 4700 flags.go:64] FLAG: --cloud-config=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681685 4700 flags.go:64] FLAG: --cloud-provider=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681696 4700 flags.go:64] FLAG: --cluster-dns="[]"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681712 4700 flags.go:64] FLAG: --cluster-domain=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681750 4700 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681761 4700 flags.go:64] FLAG: --config-dir=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681770 4700 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681780 4700 flags.go:64] FLAG: --container-log-max-files="5"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681791 4700 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681801 4700 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681838 4700 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681848 4700 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681858 4700 flags.go:64] FLAG: --contention-profiling="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681867 4700 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681877 4700 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681886 4700 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681895 4700 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681935 4700 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681944 4700 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681954 4700 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681963 4700 flags.go:64] FLAG: --enable-load-reader="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681972 4700 flags.go:64] FLAG: --enable-server="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.681981 4700 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682021 4700 flags.go:64] FLAG: --event-burst="100"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682030 4700 flags.go:64] FLAG: --event-qps="50"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682040 4700 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682049 4700 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682058 4700 flags.go:64] FLAG: --eviction-hard=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682068 4700 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682105 4700 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682115 4700 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682128 4700 flags.go:64] FLAG: --eviction-soft=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682137 4700 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682146 4700 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682155 4700 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682193 4700 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682205 4700 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682214 4700 flags.go:64] FLAG: --fail-swap-on="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682223 4700 flags.go:64] FLAG: --feature-gates=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682235 4700 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682244 4700 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682281 4700 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682291 4700 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682300 4700 flags.go:64] FLAG: --healthz-port="10248"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682348 4700 flags.go:64] FLAG: --help="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682358 4700 flags.go:64] FLAG: --hostname-override=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682367 4700 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682376 4700 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682386 4700 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682395 4700 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682432 4700 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682442 4700 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682451 4700 flags.go:64] FLAG: --image-service-endpoint=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682460 4700 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682470 4700 flags.go:64] FLAG: --kube-api-burst="100"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682480 4700 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682518 4700 flags.go:64] FLAG: --kube-api-qps="50"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682527 4700 flags.go:64] FLAG: --kube-reserved=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682536 4700 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682545 4700 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682554 4700 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682564 4700 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682573 4700 flags.go:64] FLAG: --lock-file=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682609 4700 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682620 4700 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682629 4700 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682645 4700 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682656 4700 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682666 4700 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682705 4700 flags.go:64] FLAG: --logging-format="text"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682715 4700 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682725 4700 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682735 4700 flags.go:64] FLAG: --manifest-url=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682744 4700 flags.go:64] FLAG: --manifest-url-header=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682785 4700 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682795 4700 flags.go:64] FLAG: --max-open-files="1000000"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682806 4700 flags.go:64] FLAG: --max-pods="110"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682817 4700 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682826 4700 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682835 4700 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682844 4700 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682881 4700 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682891 4700 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682900 4700 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682920 4700 flags.go:64] FLAG: --node-status-max-images="50"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682929 4700 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682967 4700 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682977 4700 flags.go:64] FLAG: --pod-cidr=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.682986 4700 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683000 4700 flags.go:64] FLAG: --pod-manifest-path=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683009 4700 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683019 4700 flags.go:64] FLAG: --pods-per-core="0"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683056 4700 flags.go:64] FLAG: --port="10250"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683066 4700 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683075 4700 flags.go:64] FLAG: --provider-id=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683084 4700 flags.go:64] FLAG: --qos-reserved=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683093 4700 flags.go:64] FLAG: --read-only-port="10255"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683102 4700 flags.go:64] FLAG: --register-node="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683140 4700 flags.go:64] FLAG: --register-schedulable="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683149 4700 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683165 4700 flags.go:64] FLAG: --registry-burst="10"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683174 4700 flags.go:64] FLAG: --registry-qps="5"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683184 4700 flags.go:64] FLAG: --reserved-cpus=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683194 4700 flags.go:64] FLAG: --reserved-memory=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683237 4700 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683246 4700 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683256 4700 flags.go:64] FLAG: --rotate-certificates="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683265 4700 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683274 4700 flags.go:64] FLAG: --runonce="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683283 4700 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683348 4700 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683359 4700 flags.go:64] FLAG: --seccomp-default="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683368 4700 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683378 4700 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683387 4700 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683397 4700 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683436 4700 flags.go:64] FLAG: --storage-driver-password="root"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683446 4700 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683455 4700 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683464 4700 flags.go:64] FLAG: --storage-driver-user="root"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683474 4700 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683483 4700 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683492 4700 flags.go:64] FLAG: --system-cgroups=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683529 4700 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683544 4700 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683553 4700 flags.go:64] FLAG: --tls-cert-file=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683562 4700 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683574 4700 flags.go:64] FLAG: --tls-min-version=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683583 4700 flags.go:64] FLAG: --tls-private-key-file=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683621 4700 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683631 4700 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683640 4700 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683649 4700 flags.go:64] FLAG: --v="2"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683669 4700 flags.go:64] FLAG: --version="false"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683711 4700 flags.go:64] FLAG: --vmodule=""
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683722 4700 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.683732 4700 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684016 4700 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684055 4700 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684066 4700 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684076 4700 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684085 4700 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684094 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684103 4700 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684112 4700 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684121 4700 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684129 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684143 4700 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684154 4700 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684164 4700 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684173 4700 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684181 4700 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684189 4700 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684197 4700 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684204 4700 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684213 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684221 4700 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684228 4700 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684236 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684244 4700 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684252 4700 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684287 4700 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684295 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684320 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684328 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684337 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684345 4700 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684353 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684361 4700 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684368 4700 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684376 4700 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684384 4700 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684391 4700 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684399 4700 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684406 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684417 4700 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684425 4700 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684435 4700 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684445 4700 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684462 4700 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684470 4700 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684478 4700 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684489 4700 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684498 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684506 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684514 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684522 4700 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684530 4700 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684538 4700 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684547 4700 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684554 4700 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684562 4700 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684570 4700 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684578 4700 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684586 4700 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684594 4700 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684602 4700 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684610 4700 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684617 4700 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684625 4700 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684635 4700 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684671 4700 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684680 4700 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684688 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684696 4700 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684704 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684712 4700 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.684720 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.685984 4700 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.698535 4700 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.698609 4700 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698781 4700 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698799 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698808 4700 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698819 4700 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698832 4700 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698845 4700 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698855 4700 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698865 4700 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698874 4700 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698883 4700 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698893 4700 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698904 4700 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698917 4700 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698929 4700 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698939 4700 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698948 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698957 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698965 4700 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698974 4700 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698983 4700 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.698991 4700 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699003 4700 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699013 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699023 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699033 4700 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699042 4700 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699052 4700 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699060 4700 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699069 4700 feature_gate.go:330] unrecognized feature gate: Example Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699080 4700 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699088 4700 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699097 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699106 4700 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699114 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699123 4700 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699131 4700 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699140 4700 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699148 4700 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699156 4700 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699165 4700 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699174 4700 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699182 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699191 4700 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699199 4700 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699212 4700 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699221 4700 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699231 4700 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699240 4700 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699249 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699260 4700 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699271 4700 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699281 4700 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699291 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699348 4700 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699358 4700 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699368 4700 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699377 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699386 4700 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699395 4700 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699403 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699412 4700 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699420 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699428 4700 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699437 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 11:20:33 crc 
kubenswrapper[4700]: W1007 11:20:33.699445 4700 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699455 4700 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699464 4700 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699472 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699480 4700 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699489 4700 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699497 4700 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.699512 4700 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699771 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699786 4700 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699797 4700 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699808 4700 feature_gate.go:330] unrecognized feature 
gate: ManagedBootImagesAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699816 4700 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699825 4700 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699833 4700 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699843 4700 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699851 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699861 4700 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699869 4700 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699878 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699887 4700 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699895 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699904 4700 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699915 4700 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699926 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699935 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699943 4700 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699952 4700 feature_gate.go:330] unrecognized feature gate: Example Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699960 4700 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699969 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699977 4700 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699986 4700 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.699995 4700 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700003 4700 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700011 4700 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700020 4700 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700028 4700 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700038 4700 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700047 4700 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700055 4700 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700063 4700 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700072 4700 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700080 4700 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700090 4700 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700098 4700 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700107 4700 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700115 4700 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700126 4700 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700137 4700 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700148 4700 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700158 4700 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700169 4700 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700179 4700 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 11:20:33 crc 
kubenswrapper[4700]: W1007 11:20:33.700194 4700 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700208 4700 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700223 4700 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700232 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700242 4700 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700252 4700 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700262 4700 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700271 4700 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700279 4700 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700288 4700 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700297 4700 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700337 4700 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700348 4700 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700359 4700 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700370 4700 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700379 4700 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700388 4700 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700396 4700 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700405 4700 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700413 4700 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700424 4700 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700432 4700 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700441 4700 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700449 4700 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700457 4700 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.700466 4700 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.700481 4700 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true 
DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.702270 4700 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.709674 4700 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.709835 4700 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.712067 4700 server.go:997] "Starting client certificate rotation" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.712113 4700 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.714517 4700 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-19 21:38:05.537792416 +0000 UTC Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.714694 4700 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1762h17m31.823107327s for next certificate rotation Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.742975 4700 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.747786 4700 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.773451 4700 log.go:25] "Validated CRI v1 runtime API" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.816110 4700 log.go:25] "Validated CRI v1 image API" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.819121 4700 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.826614 4700 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-11-08-43-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.826709 4700 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.863334 4700 manager.go:217] Machine: {Timestamp:2025-10-07 11:20:33.859411337 +0000 UTC m=+0.655810396 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a5893d7d-6930-4dc9-ad13-e4893f51c3ad BootID:b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 
HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0b:d0:1e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0b:d0:1e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:34:44:77 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8a:20:d5 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d2:0b:aa Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e2:a0:15 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:65:9a:28:72:b8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:4e:6c:32:b9:18 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 
Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 
Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.863791 4700 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.864026 4700 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.865634 4700 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.865963 4700 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.866025 4700 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.866422 4700 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.866443 4700 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.867500 4700 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.867551 4700 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.867945 4700 state_mem.go:36] "Initialized new in-memory state store" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.868083 4700 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.872629 4700 kubelet.go:418] "Attempting to sync node with API server" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.872676 4700 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.872733 4700 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.872765 4700 kubelet.go:324] "Adding apiserver pod source" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.872790 4700 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.880073 4700 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.881656 4700 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.882376 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.882526 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.882380 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.882632 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.884593 4700 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886325 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886366 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886381 
4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886394 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886417 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886431 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886445 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886467 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886483 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886500 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886520 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886543 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.886586 4700 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.887230 4700 server.go:1280] "Started kubelet" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.887585 4700 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.887836 4700 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.888871 4700 server.go:236] "Starting to 
serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.889041 4700 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.889077 4700 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.889111 4700 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:28:59.856816832 +0000 UTC Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.889237 4700 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1257h8m25.967585094s for next certificate rotation Oct 07 11:20:33 crc systemd[1]: Started Kubernetes Kubelet. Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.890610 4700 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.890634 4700 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.891070 4700 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.891141 4700 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.891473 4700 factory.go:55] Registering systemd factory Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.891510 4700 factory.go:221] Registration of the systemd container factory successfully Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.891488 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.891601 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.890962 4700 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.897880 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.897985 4700 factory.go:153] Registering CRI-O factory Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.898076 4700 factory.go:221] Registration of the crio container factory successfully Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.902856 4700 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.902925 4700 factory.go:103] Registering Raw factory Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.902957 4700 manager.go:1196] Started watching for new ooms in manager Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.903808 4700 manager.go:319] Starting recovery of all containers Oct 07 11:20:33 crc 
kubenswrapper[4700]: I1007 11:20:33.906642 4700 server.go:460] "Adding debug handlers to kubelet server" Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.904972 4700 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c318c7ec73bcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 11:20:33.887189964 +0000 UTC m=+0.683588993,LastTimestamp:2025-10-07 11:20:33.887189964 +0000 UTC m=+0.683588993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916295 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916427 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916457 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916483 4700 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916508 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916534 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916559 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916586 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916614 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916638 4700 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916664 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916692 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916717 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916800 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916843 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916877 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916905 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916938 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.916966 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917010 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917039 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917069 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917098 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917127 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917156 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917212 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917249 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917281 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917341 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917373 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917403 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917435 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917468 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917496 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 
11:20:33.917523 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917553 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917580 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917610 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917640 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917670 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917697 4700 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917723 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917751 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917778 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917806 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917835 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917916 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917949 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.917977 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918006 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918033 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918076 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918119 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918149 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918181 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918213 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918242 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918270 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918295 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918364 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918397 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918426 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918455 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918488 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918518 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 
11:20:33.918549 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918577 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918606 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918635 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918662 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918691 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918717 4700 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918744 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918771 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918796 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918824 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918853 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918882 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918909 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918940 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918968 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.918996 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919024 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919057 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919085 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919113 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919141 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919168 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919193 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919224 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919252 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919280 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919341 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919372 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919399 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919441 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919468 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919495 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919524 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919554 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919583 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919610 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919638 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919664 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919705 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919737 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919766 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919799 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919835 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919865 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919895 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919929 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919959 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.919990 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920022 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920052 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920081 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920109 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920136 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920165 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920222 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920254 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920281 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920342 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920384 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920414 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920440 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920472 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920500 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920526 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920554 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920582 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc 
kubenswrapper[4700]: I1007 11:20:33.920611 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920638 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920669 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920698 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920727 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920754 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920783 4700 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920810 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.920838 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921006 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921043 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921078 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921107 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921134 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921161 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921221 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921248 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921280 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921372 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921408 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921436 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921463 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921491 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921519 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921556 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921584 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921613 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921642 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921669 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921697 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921724 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921751 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921781 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921810 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921839 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921866 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921893 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" 
seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921919 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921947 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.921973 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.922001 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.922028 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.922054 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.922087 4700 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.925452 4700 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.925537 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.925635 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.925716 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.925988 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 11:20:33 crc 
kubenswrapper[4700]: I1007 11:20:33.926054 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926085 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926278 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926618 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926664 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926694 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926728 4700 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926758 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926784 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926813 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926844 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926870 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926899 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926930 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926957 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.926987 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927016 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927048 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927076 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927105 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927182 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927210 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927241 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927270 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927298 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" 
seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927364 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927397 4700 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927425 4700 reconstruct.go:97] "Volume reconstruction finished" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.927443 4700 reconciler.go:26] "Reconciler: start to sync state" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.931872 4700 manager.go:324] Recovery completed Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.952066 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.952042 4700 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.954164 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.954225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.954243 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955160 4700 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955187 4700 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955210 4700 state_mem.go:36] "Initialized new in-memory state store" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955765 4700 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955845 4700 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.955893 4700 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.955970 4700 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 11:20:33 crc kubenswrapper[4700]: W1007 11:20:33.958832 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.958913 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.967698 4700 policy_none.go:49] "None policy: Start" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.968569 4700 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 11:20:33 crc kubenswrapper[4700]: I1007 11:20:33.968606 4700 state_mem.go:35] "Initializing new in-memory state store" Oct 07 11:20:33 crc kubenswrapper[4700]: E1007 11:20:33.997357 4700 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.029764 4700 manager.go:334] "Starting Device Plugin manager" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.029863 4700 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.029880 4700 server.go:79] "Starting device plugin registration server" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.030414 4700 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.030435 4700 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.031262 4700 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.031435 4700 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.031453 4700 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.040455 4700 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.056552 4700 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.056653 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.057968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058004 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058019 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058149 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058416 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058496 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058893 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058922 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.058934 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059141 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059251 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059295 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059489 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.059986 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060013 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060142 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060159 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060257 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060395 
4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060428 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060935 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060961 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.060974 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061081 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061201 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061231 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061249 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061870 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061892 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061906 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061921 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061910 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.061956 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.062157 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.062192 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.063055 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.063078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.063091 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.098628 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129776 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129815 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129835 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129851 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129871 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129886 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129902 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129953 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.129990 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130032 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130117 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130140 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130158 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130200 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130216 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.130603 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.132104 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.132145 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.132159 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.132196 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.132588 4700 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.231535 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.231826 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232146 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.231970 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232259 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232294 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232346 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232380 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232406 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232434 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232463 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232491 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232521 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232553 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232580 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232611 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.232640 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233056 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233106 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233141 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233175 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233204 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233237 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233266 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233299 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233428 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233478 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc 
kubenswrapper[4700]: I1007 11:20:34.233444 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233546 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.233624 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.333623 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.335052 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.335123 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.335139 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.335173 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.335579 4700 kubelet_node_status.go:99] "Unable to register node with 
API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.383391 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.398772 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.416596 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.422784 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.426523 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.435066 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-91d4f1fc4db5cbe6d07e293fa2eb6764aae965ac4d9a402074094e757e6d2236 WatchSource:0}: Error finding container 91d4f1fc4db5cbe6d07e293fa2eb6764aae965ac4d9a402074094e757e6d2236: Status 404 returned error can't find the container with id 91d4f1fc4db5cbe6d07e293fa2eb6764aae965ac4d9a402074094e757e6d2236 Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.448601 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4b6da629f101e7457654e5a1836c4365f95589220ad85525849acf224e078d7d WatchSource:0}: Error finding container 4b6da629f101e7457654e5a1836c4365f95589220ad85525849acf224e078d7d: Status 404 returned error can't find the container with id 4b6da629f101e7457654e5a1836c4365f95589220ad85525849acf224e078d7d Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.450242 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-290b815c5e205ecd6972eaa837439c444889d643b8e4de53f11705b1f0b08b5e WatchSource:0}: Error finding container 290b815c5e205ecd6972eaa837439c444889d643b8e4de53f11705b1f0b08b5e: Status 404 returned error can't find the container with id 290b815c5e205ecd6972eaa837439c444889d643b8e4de53f11705b1f0b08b5e Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.452649 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-682d1418589a28a2cf0904eb7939330ae24f75390a1e442112c0c59fe191237c 
WatchSource:0}: Error finding container 682d1418589a28a2cf0904eb7939330ae24f75390a1e442112c0c59fe191237c: Status 404 returned error can't find the container with id 682d1418589a28a2cf0904eb7939330ae24f75390a1e442112c0c59fe191237c Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.500002 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.692245 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.692376 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:34 crc kubenswrapper[4700]: W1007 11:20:34.703269 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.703344 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" 
logger="UnhandledError" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.736657 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.738927 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.739014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.739034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.739081 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: E1007 11:20:34.739818 4700 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.892513 4700 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.961940 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"682d1418589a28a2cf0904eb7939330ae24f75390a1e442112c0c59fe191237c"} Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.963683 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b6da629f101e7457654e5a1836c4365f95589220ad85525849acf224e078d7d"} Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.964971 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"290b815c5e205ecd6972eaa837439c444889d643b8e4de53f11705b1f0b08b5e"} Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.966497 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91d4f1fc4db5cbe6d07e293fa2eb6764aae965ac4d9a402074094e757e6d2236"} Oct 07 11:20:34 crc kubenswrapper[4700]: I1007 11:20:34.967870 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9684f23e02ff651d5cdaf6f6dc9b916ea9971bb900c536561a856027681d40bb"} Oct 07 11:20:35 crc kubenswrapper[4700]: W1007 11:20:35.138656 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:35 crc kubenswrapper[4700]: E1007 11:20:35.139605 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:35 crc kubenswrapper[4700]: E1007 11:20:35.300890 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.540158 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.542227 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.542280 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.542297 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.542353 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:35 crc kubenswrapper[4700]: E1007 11:20:35.543035 4700 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Oct 07 11:20:35 crc kubenswrapper[4700]: W1007 11:20:35.547783 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:35 crc kubenswrapper[4700]: E1007 11:20:35.547890 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" 
Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.892016 4700 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.974996 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090" exitCode=0 Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.975069 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.975201 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.976683 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.976758 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.976788 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.978135 4700 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2430db1190343f7d39181320ee028ba70f41d1f6420980596528948b9e351108" exitCode=0 Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.978208 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc 
kubenswrapper[4700]: I1007 11:20:35.978212 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2430db1190343f7d39181320ee028ba70f41d1f6420980596528948b9e351108"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.979134 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.979158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.979171 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.979475 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.980964 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.980997 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.981011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.981159 4700 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f" exitCode=0 Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.981369 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.981397 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.983082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.983148 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.983173 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.985486 4700 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff" exitCode=0 Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.985534 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.985614 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.987536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.987588 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.987609 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.995365 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.995505 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.995522 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.995534 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef"} Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.995681 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.997353 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 11:20:35.997401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:35 crc kubenswrapper[4700]: I1007 
11:20:35.997422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:36 crc kubenswrapper[4700]: W1007 11:20:36.428910 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:36 crc kubenswrapper[4700]: E1007 11:20:36.429014 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:36 crc kubenswrapper[4700]: I1007 11:20:36.892533 4700 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:36 crc kubenswrapper[4700]: E1007 11:20:36.902079 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="3.2s" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.001592 4700 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8d8c9d34d2ac3bf4eb104620d717b6530171a575e6333deed43c4aabb253235e" exitCode=0 Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.001729 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.001716 4700 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8d8c9d34d2ac3bf4eb104620d717b6530171a575e6333deed43c4aabb253235e"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.003062 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.003161 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.003261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.006749 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.006880 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.007665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.007697 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.007709 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009187 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009206 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009218 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009230 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009807 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009834 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.009844 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011327 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011356 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011369 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011380 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809"} Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011384 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011882 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011903 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.011911 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.143129 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.144389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.144423 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:37 crc 
kubenswrapper[4700]: I1007 11:20:37.144437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.144462 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:37 crc kubenswrapper[4700]: E1007 11:20:37.144950 4700 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Oct 07 11:20:37 crc kubenswrapper[4700]: E1007 11:20:37.162987 4700 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c318c7ec73bcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 11:20:33.887189964 +0000 UTC m=+0.683588993,LastTimestamp:2025-10-07 11:20:33.887189964 +0000 UTC m=+0.683588993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 11:20:37 crc kubenswrapper[4700]: I1007 11:20:37.286322 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:37 crc kubenswrapper[4700]: W1007 11:20:37.361927 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:37 crc 
kubenswrapper[4700]: E1007 11:20:37.362022 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:37 crc kubenswrapper[4700]: W1007 11:20:37.372067 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:37 crc kubenswrapper[4700]: E1007 11:20:37.372192 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:37 crc kubenswrapper[4700]: W1007 11:20:37.516839 4700 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Oct 07 11:20:37 crc kubenswrapper[4700]: E1007 11:20:37.516953 4700 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.018565 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273"} Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.018659 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.019720 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.019752 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.019765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022611 4700 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="679d1fa7372bf615106ff8df858107c1acea99f339f8e7043b82c933d4b20caf" exitCode=0 Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022703 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022715 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022736 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022758 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022790 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.022707 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"679d1fa7372bf615106ff8df858107c1acea99f339f8e7043b82c933d4b20caf"} Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024076 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024099 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024086 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024147 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024173 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024235 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 
11:20:38.024239 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.024301 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.125193 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.132332 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.333088 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.665035 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:20:38 crc kubenswrapper[4700]: I1007 11:20:38.954275 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031069 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55ca59f3191332affc6cdac5039fe751e54439ab80ed2998e18b086a4236f488"} Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031142 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d525716345ab704ad984ed95d3ce1bb2b5c83db65fcfe914d74ec5a58fbd8c8"} Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031168 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e954668210c3db738563c35ca84fd8b8a56153b8bbdd5a3b90be987bfe6cfcc1"} Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031194 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031375 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031201 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"025928f654a09001375bc28171f776f2bbff271458376e9e1acb7eedc3c72616"} Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031446 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.031505 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e8ac527143b02ad63853d391da5183e040e999f1d1eb8aadaec9ae0ddc4dae26"} Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.032345 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.032404 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.032757 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.032819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:39 crc kubenswrapper[4700]: 
I1007 11:20:39.032833 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033362 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033391 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033443 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033460 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033501 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.033556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:39 crc kubenswrapper[4700]: I1007 11:20:39.149240 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.034594 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.035398 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.035438 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.035681 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036125 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036197 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036222 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036725 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036823 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.036847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.037019 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.037077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.037101 4700 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.287070 4700 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.287202 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.345642 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.347731 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.347801 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.347822 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.347861 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.436690 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:40 crc kubenswrapper[4700]: I1007 11:20:40.884771 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.037182 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.037236 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.037191 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.038978 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039101 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039252 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039367 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.039396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 
11:20:41.039412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:41 crc kubenswrapper[4700]: I1007 11:20:41.662823 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 11:20:42 crc kubenswrapper[4700]: I1007 11:20:42.040478 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:42 crc kubenswrapper[4700]: I1007 11:20:42.041853 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:42 crc kubenswrapper[4700]: I1007 11:20:42.041897 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:42 crc kubenswrapper[4700]: I1007 11:20:42.041919 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:44 crc kubenswrapper[4700]: E1007 11:20:44.040620 4700 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 11:20:47 crc kubenswrapper[4700]: I1007 11:20:47.892701 4700 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.058724 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.061640 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273" exitCode=255 Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 
11:20:48.061694 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273"} Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.061864 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.063168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.063435 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.063658 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.064940 4700 scope.go:117] "RemoveContainer" containerID="5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.204410 4700 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.204483 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.209370 4700 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 11:20:48 crc kubenswrapper[4700]: I1007 11:20:48.209445 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.068527 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.071156 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2"} Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.071392 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.072348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.072398 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:49 crc kubenswrapper[4700]: I1007 11:20:49.072413 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.288216 4700 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.288399 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.442020 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.442242 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.444179 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.444248 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.444259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.892151 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.892412 4700 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.892580 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.893909 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.894001 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.894032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:50 crc kubenswrapper[4700]: I1007 11:20:50.900001 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.076526 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.078115 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.078177 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.078187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.695031 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.695359 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 
11:20:51.697039 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.697122 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.697150 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:51 crc kubenswrapper[4700]: I1007 11:20:51.717686 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.080106 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.080157 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.081939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.082010 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.082034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.082190 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.082243 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:52 crc kubenswrapper[4700]: I1007 11:20:52.082263 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 07 11:20:53 crc kubenswrapper[4700]: E1007 11:20:53.199459 4700 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.202040 4700 trace.go:236] Trace[1286690976]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 11:20:41.075) (total time: 12126ms): Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[1286690976]: ---"Objects listed" error: 12126ms (11:20:53.201) Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[1286690976]: [12.126180644s] [12.126180644s] END Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.202080 4700 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.203059 4700 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.204760 4700 trace.go:236] Trace[885730263]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 11:20:41.426) (total time: 11777ms): Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[885730263]: ---"Objects listed" error: 11777ms (11:20:53.204) Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[885730263]: [11.777780867s] [11.777780867s] END Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.204800 4700 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.205786 4700 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.206599 4700 trace.go:236] Trace[423087183]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 11:20:41.598) (total 
time: 11607ms): Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[423087183]: ---"Objects listed" error: 11607ms (11:20:53.206) Oct 07 11:20:53 crc kubenswrapper[4700]: Trace[423087183]: [11.607663081s] [11.607663081s] END Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.206645 4700 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 07 11:20:53 crc kubenswrapper[4700]: E1007 11:20:53.207865 4700 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.885355 4700 apiserver.go:52] "Watching apiserver" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.892436 4700 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.892799 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.893251 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.893283 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.893348 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:53 crc kubenswrapper[4700]: E1007 11:20:53.893517 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:20:53 crc kubenswrapper[4700]: E1007 11:20:53.893629 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.894065 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.894601 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.894681 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:53 crc kubenswrapper[4700]: E1007 11:20:53.894748 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.896773 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.897198 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.897527 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.897622 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.901779 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.902067 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.902136 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.902158 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.902227 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.928396 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.962805 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.991773 4700 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 11:20:53 crc kubenswrapper[4700]: I1007 11:20:53.995560 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010044 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010099 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010129 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010154 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010176 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010203 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010230 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010253 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010279 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010327 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010348 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010377 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010401 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010424 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010446 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010469 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010492 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010516 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010539 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010525 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010559 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010785 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010815 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010835 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.010865 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011256 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011282 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011436 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011468 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011491 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011552 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011575 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011602 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011622 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " 
Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011655 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011681 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011701 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011722 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011740 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011758 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011777 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011795 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011817 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011837 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011861 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011881 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011899 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011916 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011934 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011955 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011974 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011994 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012016 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012034 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012052 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012071 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 
11:20:54.012094 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012112 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012130 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012147 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012165 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012183 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012201 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012217 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012233 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012250 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012270 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012287 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012308 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012341 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012360 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012380 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012397 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 11:20:54 
crc kubenswrapper[4700]: I1007 11:20:54.012414 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012432 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012451 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012472 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012493 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012509 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012525 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012542 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012559 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012575 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012592 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 11:20:54 
crc kubenswrapper[4700]: I1007 11:20:54.012611 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012630 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012648 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012663 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012679 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012696 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012712 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012729 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012746 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012762 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012779 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") 
" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012795 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012814 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012830 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012850 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012868 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012887 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012904 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012922 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012939 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012955 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012972 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012991 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013008 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013023 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013039 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013056 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013072 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013089 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013105 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013123 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013140 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013158 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: 
I1007 11:20:54.013177 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013205 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013224 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013241 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013268 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013291 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013324 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013345 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013368 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013386 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013405 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc 
kubenswrapper[4700]: I1007 11:20:54.010993 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013446 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011049 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011115 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011171 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011614 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011662 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011731 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011771 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011828 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011912 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.011967 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012029 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012031 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012148 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012173 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012200 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012238 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.012433 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013144 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013201 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013194 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013405 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013412 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013520 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013539 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013714 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013425 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013776 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013790 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013794 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013818 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013781 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013843 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013866 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013893 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 
11:20:54.013914 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013935 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013953 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013973 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013990 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014008 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") 
pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014025 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014045 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014064 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014083 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014100 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.013862 4700 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014033 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014067 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014114 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014124 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014426 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.014130 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:20:54.51410794 +0000 UTC m=+21.310506999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014481 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014505 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014502 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014514 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014590 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014616 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014640 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014634 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014691 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014718 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014741 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014762 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014786 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014832 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014858 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014878 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014902 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014934 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014964 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014986 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015008 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015029 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015051 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015074 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015095 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015114 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015137 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015155 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015174 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015192 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015213 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015233 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015259 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015277 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015298 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015336 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015355 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015377 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015396 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015414 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 11:20:54 crc 
kubenswrapper[4700]: I1007 11:20:54.015432 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015453 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015471 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015489 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015512 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015532 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015587 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015605 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015628 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015650 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015670 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015692 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015710 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015729 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015751 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015773 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015829 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015862 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015887 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015910 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015932 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015954 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015981 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016002 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016080 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016125 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016158 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016187 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016216 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016247 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016345 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016362 
4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016376 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016390 4700 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016404 4700 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016421 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016435 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016449 4700 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016463 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016478 4700 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016492 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016506 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016523 4700 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016537 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016551 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016565 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016579 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016592 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016605 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016618 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016631 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016644 4700 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016658 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" 
DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016672 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016688 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016701 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016715 4700 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016728 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016743 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016760 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc 
kubenswrapper[4700]: I1007 11:20:54.016773 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016787 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016801 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016816 4700 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016830 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016843 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016857 4700 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016874 4700 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016887 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016903 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016916 4700 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.017479 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.017625 4700 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014649 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014672 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014355 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014373 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014738 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014769 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.020963 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.020975 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014855 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014936 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014960 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014922 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015050 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015165 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015202 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015270 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015284 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015383 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015431 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015466 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015490 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015602 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015632 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015729 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015748 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015852 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015956 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015961 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.015980 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016028 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016098 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016307 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016328 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.017136 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.016896 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.017864 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.018849 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.019683 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.019992 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.020026 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.020260 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.020883 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.014246 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.021221 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.021840 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022024 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022384 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022561 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022627 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022750 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022998 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.022966 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023130 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023250 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023338 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023469 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023690 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023714 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023749 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023877 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.023949 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024000 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024162 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024206 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024250 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024436 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.024819 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.025245 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.025886 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.025941 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.026448 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.026504 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.026723 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.027769 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028209 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028209 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028468 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028725 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028744 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.028854 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.028906 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:54.528891648 +0000 UTC m=+21.325290637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028929 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028922 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.028964 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029050 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029388 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029364 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029412 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029457 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029733 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029872 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.029991 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030025 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030089 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030254 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030329 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030407 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030588 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030678 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030793 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030865 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.030967 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.031085 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031160 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.031172 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:54.531152439 +0000 UTC m=+21.327551428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031204 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031326 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031477 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031611 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.031678 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032019 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032067 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032478 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032588 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032616 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.032868 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.033428 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.034973 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.035575 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.035650 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.035673 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.035982 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.036015 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.036030 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.036047 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.036501 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.037015 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.037201 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.037384 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.038248 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.038294 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.039100 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.039414 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.040040 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.040213 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.040683 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.040709 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.041814 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.042207 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.042359 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.042460 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.042594 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.042788 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.043113 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.043140 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.043200 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.043644 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.043947 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.044633 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.045099 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.045118 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.045131 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.045187 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:54.545170147 +0000 UTC m=+21.341569136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.045465 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.046322 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.046526 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.046557 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.046944 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.047525 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.047584 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.049149 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.052859 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.052901 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.052914 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.052980 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:54.552965098 +0000 UTC m=+21.349364087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.053874 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.057254 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.069002 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.069474 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.075974 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.080702 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.082607 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.092745 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.093268 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.094328 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 11:20:54 crc kubenswrapper[4700]: 
I1007 11:20:54.097963 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2" exitCode=255 Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.098003 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2"} Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.098057 4700 scope.go:117] "RemoveContainer" containerID="5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.098680 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.101662 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.110728 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.111467 4700 scope.go:117] "RemoveContainer" containerID="34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.111713 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 
07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.112167 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118328 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118362 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118421 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118432 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118443 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118451 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118461 4700 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118469 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118478 4700 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118486 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118482 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118486 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 
11:20:54.118494 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118567 4700 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118583 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118596 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118607 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118618 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118630 4700 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118643 4700 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118653 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118665 4700 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118677 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118689 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118700 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118713 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118725 4700 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118736 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118749 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118760 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118771 4700 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118782 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118794 4700 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118807 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node 
\"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118818 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118830 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118842 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118854 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118866 4700 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118877 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118890 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118902 4700 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118915 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118928 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118941 4700 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118954 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118966 4700 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118977 4700 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.118988 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119000 4700 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119012 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119023 4700 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119035 4700 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119046 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119058 4700 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119069 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 
crc kubenswrapper[4700]: I1007 11:20:54.119081 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119093 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119104 4700 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119117 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119129 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119141 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119152 4700 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119164 4700 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119177 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119189 4700 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119201 4700 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119212 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119224 4700 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119236 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119249 4700 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119260 4700 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119270 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119283 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119295 4700 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119332 4700 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119344 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119358 4700 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119369 4700 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119383 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119394 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119405 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119416 4700 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119427 4700 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119438 4700 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc 
kubenswrapper[4700]: I1007 11:20:54.119451 4700 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119462 4700 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119473 4700 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119483 4700 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119494 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119507 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119518 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119531 4700 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119542 4700 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119553 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119567 4700 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119579 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119592 4700 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119604 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119617 4700 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 
11:20:54.119630 4700 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119643 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119655 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119669 4700 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119682 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119694 4700 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119706 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119718 4700 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119731 4700 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119743 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119754 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119766 4700 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119778 4700 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119789 4700 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119802 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119815 4700 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119829 4700 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119841 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119852 4700 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119864 4700 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119878 4700 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119893 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119908 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119922 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119934 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119945 4700 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119957 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119971 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119983 4700 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") 
on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.119994 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120006 4700 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120018 4700 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120037 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120050 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120077 4700 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120088 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc 
kubenswrapper[4700]: I1007 11:20:54.120101 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120112 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120124 4700 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120136 4700 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120148 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120159 4700 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120170 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120182 4700 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120194 4700 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120205 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120218 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120230 4700 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120251 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120264 4700 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120278 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120290 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120324 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120337 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120348 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120362 4700 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.120373 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.121006 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.128123 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.141457 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.153717 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.167369 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5140f235a457add0a9341ec3190ce71c7533a02f5ab60edd57857dbc1b716273\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:47Z\\\",\\\"message\\\":\\\"W1007 11:20:37.126630 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 11:20:37.127036 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759836037 cert, and key in /tmp/serving-cert-1192647323/serving-signer.crt, /tmp/serving-cert-1192647323/serving-signer.key\\\\nI1007 11:20:37.493152 1 observer_polling.go:159] Starting file observer\\\\nW1007 11:20:37.495299 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 11:20:37.495536 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 11:20:37.496485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1192647323/tls.crt::/tmp/serving-cert-1192647323/tls.key\\\\\\\"\\\\nF1007 11:20:47.907340 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to 
sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.179288 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.192635 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.204397 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.207557 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.214088 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.215126 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.223223 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 11:20:54 crc kubenswrapper[4700]: W1007 11:20:54.226262 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6851cd5a2492f80a65ea01a118d508d4e4ca9501f311e0f3dd0a3b77fc430e3f WatchSource:0}: Error finding container 6851cd5a2492f80a65ea01a118d508d4e4ca9501f311e0f3dd0a3b77fc430e3f: Status 404 returned error can't find the container with id 6851cd5a2492f80a65ea01a118d508d4e4ca9501f311e0f3dd0a3b77fc430e3f Oct 07 11:20:54 crc kubenswrapper[4700]: W1007 11:20:54.239002 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9068b33a3729adaddea24dabc858d37bb42ca3714c35e0a31fcd2728b7f37d3b WatchSource:0}: Error finding container 9068b33a3729adaddea24dabc858d37bb42ca3714c35e0a31fcd2728b7f37d3b: Status 404 returned error can't find the container with id 9068b33a3729adaddea24dabc858d37bb42ca3714c35e0a31fcd2728b7f37d3b Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.523792 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.523943 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:20:55.523922629 +0000 UTC m=+22.320321628 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.625207 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.625299 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.625405 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:54 crc kubenswrapper[4700]: I1007 11:20:54.625450 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.625641 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.625668 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.625692 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.625767 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:55.625744025 +0000 UTC m=+22.422143054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.625975 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626071 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:55.626051023 +0000 UTC m=+22.422450052 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626144 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626162 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626236 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:55.626195127 +0000 UTC m=+22.422594156 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626255 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626276 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:54 crc kubenswrapper[4700]: E1007 11:20:54.626449 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:55.626393653 +0000 UTC m=+22.422792842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.104210 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.108926 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9068b33a3729adaddea24dabc858d37bb42ca3714c35e0a31fcd2728b7f37d3b"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.109180 4700 scope.go:117] "RemoveContainer" containerID="34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2" Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.110246 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.111858 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 
11:20:55.111908 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.111929 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4ac272a80b5ca22f3f64ebf30452508a3e197e44f6c6c12ca468d48f7ce8e350"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.114257 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.114363 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6851cd5a2492f80a65ea01a118d508d4e4ca9501f311e0f3dd0a3b77fc430e3f"} Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.133186 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.154505 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.175341 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.198613 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.220203 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.244339 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.265029 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.298995 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.319536 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.342298 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.364586 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.386261 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.407562 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.428094 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.535768 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.536012 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:20:57.535964883 +0000 UTC m=+24.332363912 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.636456 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.636510 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.636536 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.636566 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636670 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636715 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636742 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636774 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636791 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636798 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:57.636772891 +0000 UTC m=+24.433172090 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636744 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636860 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:57.636839843 +0000 UTC m=+24.433238842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636864 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636926 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-10-07 11:20:57.636910845 +0000 UTC m=+24.433310034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636854 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.636996 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:20:57.636987627 +0000 UTC m=+24.433386626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.957207 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.957258 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.957384 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.957421 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.957626 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:20:55 crc kubenswrapper[4700]: E1007 11:20:55.957839 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.962003 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.962523 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.963365 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.964002 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.964665 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.965195 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.965820 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.966419 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.967071 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.967632 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.968135 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.968816 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.969290 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.969824 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.970378 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.970885 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.972326 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.973045 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.973730 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.974457 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.974923 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.975527 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.975958 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.976615 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.977044 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.977626 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.978223 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.978704 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.979250 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.979772 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.980285 4700 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.980412 4700 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.981766 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.985260 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.985748 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.987212 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.988441 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.988913 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.989899 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.990537 4700 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.991374 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.992067 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.993230 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.994033 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.995113 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.995842 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.996997 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.997785 4700 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.998778 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 11:20:55 crc kubenswrapper[4700]: I1007 11:20:55.999263 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 11:20:56 crc kubenswrapper[4700]: I1007 11:20:56.000089 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 11:20:56 crc kubenswrapper[4700]: I1007 11:20:56.000699 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 11:20:56 crc kubenswrapper[4700]: I1007 11:20:56.001246 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 11:20:56 crc kubenswrapper[4700]: I1007 11:20:56.002079 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.121922 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed"} Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.161899 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.181790 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.205016 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.225169 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.248802 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.271393 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.290746 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.294044 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.300983 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.304701 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.310899 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.324463 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.341601 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.357091 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.374503 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.389054 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.399854 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.410266 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.421625 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.436449 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.448120 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.463194 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.479870 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.496651 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.507982 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:57Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.554905 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.555183 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:21:01.555146179 +0000 UTC m=+28.351545198 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.656636 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.656736 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.656795 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.656888 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.656959 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657009 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657036 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657046 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657083 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657129 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657176 4700 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657206 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657149 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:01.657113819 +0000 UTC m=+28.453512848 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657489 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:01.657367556 +0000 UTC m=+28.453766585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657552 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:01.65752717 +0000 UTC m=+28.453926469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.657595 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:01.657577841 +0000 UTC m=+28.453977100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.956874 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.956929 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:57 crc kubenswrapper[4700]: I1007 11:20:57.956938 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.957140 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.957353 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:20:57 crc kubenswrapper[4700]: E1007 11:20:57.957494 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:20:58 crc kubenswrapper[4700]: E1007 11:20:58.134761 4700 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.354419 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zfjvk"] Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.354804 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.357694 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.358163 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.358164 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.358400 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.374846 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.389187 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.405661 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.419012 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.435216 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.450653 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.464166 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-host\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.464222 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76nc\" (UniqueName: \"kubernetes.io/projected/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-kube-api-access-g76nc\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.464239 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-serviceca\") pod 
\"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.471807 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.485047 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.498933 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.565514 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-host\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.565580 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76nc\" (UniqueName: \"kubernetes.io/projected/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-kube-api-access-g76nc\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.565605 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-serviceca\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.565646 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-host\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.568222 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-serviceca\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.599440 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76nc\" (UniqueName: \"kubernetes.io/projected/71b27ab9-cbbc-40db-b3b8-aee0d4de26eb-kube-api-access-g76nc\") pod \"node-ca-zfjvk\" (UID: \"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\") " pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.666615 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zfjvk" Oct 07 11:20:58 crc kubenswrapper[4700]: W1007 11:20:58.679006 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b27ab9_cbbc_40db_b3b8_aee0d4de26eb.slice/crio-da65f8ff82b512667d17a9f4e02d0b084e41ef8bdbb270c7498674a59a151801 WatchSource:0}: Error finding container da65f8ff82b512667d17a9f4e02d0b084e41ef8bdbb270c7498674a59a151801: Status 404 returned error can't find the container with id da65f8ff82b512667d17a9f4e02d0b084e41ef8bdbb270c7498674a59a151801 Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.752000 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ndw62"] Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.752386 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.754704 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.754704 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.754712 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.771481 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.782925 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.796671 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.811025 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.824538 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.838605 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.850450 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.869089 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchk4\" (UniqueName: \"kubernetes.io/projected/4e752a84-4326-406c-9673-bd83defa2365-kube-api-access-lchk4\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.869498 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e752a84-4326-406c-9673-bd83defa2365-hosts-file\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.871208 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.885547 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.897896 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:58Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.970906 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchk4\" (UniqueName: \"kubernetes.io/projected/4e752a84-4326-406c-9673-bd83defa2365-kube-api-access-lchk4\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.970949 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e752a84-4326-406c-9673-bd83defa2365-hosts-file\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.971062 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e752a84-4326-406c-9673-bd83defa2365-hosts-file\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " 
pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:58 crc kubenswrapper[4700]: I1007 11:20:58.989234 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchk4\" (UniqueName: \"kubernetes.io/projected/4e752a84-4326-406c-9673-bd83defa2365-kube-api-access-lchk4\") pod \"node-resolver-ndw62\" (UID: \"4e752a84-4326-406c-9673-bd83defa2365\") " pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.069066 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ndw62" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.084202 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e752a84_4326_406c_9673_bd83defa2365.slice/crio-a23b2fa65e3e1798acf5b534cb408cb76ea4112a835e8ed4d17a8361c2de190e WatchSource:0}: Error finding container a23b2fa65e3e1798acf5b534cb408cb76ea4112a835e8ed4d17a8361c2de190e: Status 404 returned error can't find the container with id a23b2fa65e3e1798acf5b534cb408cb76ea4112a835e8ed4d17a8361c2de190e Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.129413 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ndw62" event={"ID":"4e752a84-4326-406c-9673-bd83defa2365","Type":"ContainerStarted","Data":"a23b2fa65e3e1798acf5b534cb408cb76ea4112a835e8ed4d17a8361c2de190e"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.131529 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfjvk" event={"ID":"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb","Type":"ContainerStarted","Data":"0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.131602 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfjvk" 
event={"ID":"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb","Type":"ContainerStarted","Data":"da65f8ff82b512667d17a9f4e02d0b084e41ef8bdbb270c7498674a59a151801"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.167860 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fk4xc"] Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.168696 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.179785 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-v6h5r"] Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.180141 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.181138 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.181393 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.184028 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wvx6b"] Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.184699 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.185073 4700 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.185124 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.185190 4700 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.185213 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.185281 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.185468 4700 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.185494 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.186477 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.186572 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zhd4s"] Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.186827 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.187405 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.188186 4700 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.188472 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.191183 4700 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.191452 4700 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199454 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.191514 4700 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc 
kubenswrapper[4700]: E1007 11:20:59.199492 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.191537 4700 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199514 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.192590 4700 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199535 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User 
\"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.193023 4700 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199552 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.193136 4700 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199570 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.193482 4700 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.195458 4700 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.199751 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.195880 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.200284 4700 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.228522 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.257831 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.272856 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-multus\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.272910 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-os-release\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.272934 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.272960 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-k8s-cni-cncf-io\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.272982 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273008 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273092 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273227 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-socket-dir-parent\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273313 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273400 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-kubelet\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273438 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/97a77b38-e9b1-4243-ac3a-28d83d87cf15-rootfs\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc 
kubenswrapper[4700]: I1007 11:20:59.273481 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273511 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273537 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273567 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-hostroot\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273665 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " 
pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273733 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273756 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273771 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273790 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-cnibin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273847 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273882 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-multus-daemon-config\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.273924 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-multus-certs\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274082 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274186 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274261 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet\") pod \"ovnkube-node-fk4xc\" (UID: 
\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274290 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274342 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-cni-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274373 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a77b38-e9b1-4243-ac3a-28d83d87cf15-proxy-tls\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274400 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274443 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274469 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274492 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-system-cni-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274517 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-netns\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274592 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-bin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274669 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbht\" (UniqueName: 
\"kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274697 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-etc-kubernetes\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274719 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-cnibin\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274740 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nr55\" (UniqueName: \"kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274763 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-os-release\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274780 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274803 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274823 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-system-cni-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274862 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274906 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.274982 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-conf-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.275030 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pnm\" (UniqueName: \"kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.275091 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5xw\" (UniqueName: \"kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.283677 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.303321 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.325601 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.344277 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.361235 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375729 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375778 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375802 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375821 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-socket-dir-parent\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375837 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375856 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375872 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375890 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-kubelet\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375910 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/97a77b38-e9b1-4243-ac3a-28d83d87cf15-rootfs\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375927 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-hostroot\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.375974 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-hostroot\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376006 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: 
\"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376030 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-cnibin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376114 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376141 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376160 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376183 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc 
kubenswrapper[4700]: I1007 11:20:59.376204 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376222 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376244 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-multus-daemon-config\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376266 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-multus-certs\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376287 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376326 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-cni-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376364 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376382 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a77b38-e9b1-4243-ac3a-28d83d87cf15-proxy-tls\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376439 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376458 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376482 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-netns\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376500 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-bin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376528 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376552 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-system-cni-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376577 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbht\" (UniqueName: \"kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376604 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-os-release\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376622 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376640 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-etc-kubernetes\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376658 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-cnibin\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376681 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr55\" (UniqueName: \"kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376704 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns\") pod 
\"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376721 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376729 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376739 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-system-cni-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376769 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-system-cni-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376791 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: 
\"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376814 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5xw\" (UniqueName: \"kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376835 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-conf-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376852 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pnm\" (UniqueName: \"kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376871 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-multus\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376885 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-cnibin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc 
kubenswrapper[4700]: I1007 11:20:59.376893 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376911 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376916 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-k8s-cni-cncf-io\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376937 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376940 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.376963 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-os-release\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377146 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-os-release\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377186 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-conf-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377264 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-netns\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377345 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-k8s-cni-cncf-io\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377349 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377378 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377398 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377428 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377265 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-multus\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377508 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides\") pod \"ovnkube-node-fk4xc\" (UID: 
\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377542 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/97a77b38-e9b1-4243-ac3a-28d83d87cf15-rootfs\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377551 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377551 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-run-multus-certs\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377544 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377590 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-cni-bin\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: 
I1007 11:20:59.377609 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377636 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-cni-dir\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377621 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377665 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-host-var-lib-kubelet\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377710 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377714 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377737 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-cnibin\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377670 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377693 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377723 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-etc-kubernetes\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377686 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-system-cni-dir\") pod 
\"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377675 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-multus-socket-dir-parent\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.377855 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/869af552-a034-4af4-b46a-492798633d24-os-release\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.378090 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.378348 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-multus-daemon-config\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.378443 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea8470-d501-4d05-acb7-554792918f7c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:20:59 crc 
kubenswrapper[4700]: I1007 11:20:59.378966 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.381687 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.391698 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.401618 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5xw\" (UniqueName: \"kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw\") pod \"ovnkube-node-fk4xc\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.406977 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.419856 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.433676 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.456983 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.471106 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.484998 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.487133 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.498037 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: W1007 11:20:59.498662 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a75e4c_2144_40de_9abc_f0bb7a143a0e.slice/crio-a49cc3a7f7bd18c4a35ef0fd84fdff299f285659a581d3dbc11583db5f606b3a WatchSource:0}: Error finding container a49cc3a7f7bd18c4a35ef0fd84fdff299f285659a581d3dbc11583db5f606b3a: Status 404 returned error can't find the container with id a49cc3a7f7bd18c4a35ef0fd84fdff299f285659a581d3dbc11583db5f606b3a Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.509904 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.522838 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.537099 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.549918 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.561281 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.571882 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.584190 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.608653 4700 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.609866 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.609913 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.609923 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.610061 4700 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.617324 4700 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.617695 4700 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.618798 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.618857 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.618871 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.618888 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.618902 4700 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.639477 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.643969 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.644011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.644026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.644043 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.644054 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.655836 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.659342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.659393 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.659408 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.659432 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.659448 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.671603 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.675225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.675277 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.675291 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.675324 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.675335 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.686974 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.690418 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.690468 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.690483 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.690500 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.690512 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.701754 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:20:59Z is after 2025-08-24T17:21:41Z" Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.701880 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.703803 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.703837 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.703845 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.703860 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.703872 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.806055 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.806097 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.806106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.806120 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.806134 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.908798 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.908849 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.908862 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.908882 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.908895 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:20:59Z","lastTransitionTime":"2025-10-07T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.956440 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.956564 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.956898 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.956960 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:20:59 crc kubenswrapper[4700]: I1007 11:20:59.956973 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:20:59 crc kubenswrapper[4700]: E1007 11:20:59.957164 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.013336 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.013382 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.013392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.013411 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.013422 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.015958 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.018104 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.116206 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.116283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.116292 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.116330 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.116342 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.121794 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.134081 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a77b38-e9b1-4243-ac3a-28d83d87cf15-proxy-tls\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.137923 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ndw62" event={"ID":"4e752a84-4326-406c-9673-bd83defa2365","Type":"ContainerStarted","Data":"2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.140742 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" exitCode=0 Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.140789 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.140809 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"a49cc3a7f7bd18c4a35ef0fd84fdff299f285659a581d3dbc11583db5f606b3a"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.158019 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.184007 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.204499 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.219357 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.219424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.219442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.219465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.219482 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.223236 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.240018 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.256950 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.271189 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.282274 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.286114 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.304473 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.322032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.322067 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.322078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.322093 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.322102 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.330034 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T
11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.342101 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.356264 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.375425 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.377817 4700 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.377896 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config podName:97a77b38-e9b1-4243-ac3a-28d83d87cf15 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.877873465 +0000 UTC m=+27.674272454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config") pod "machine-config-daemon-v6h5r" (UID: "97a77b38-e9b1-4243-ac3a-28d83d87cf15") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.377990 4700 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.378117 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy podName:869af552-a034-4af4-b46a-492798633d24 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.878091381 +0000 UTC m=+27.674490360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy") pod "multus-zhd4s" (UID: "869af552-a034-4af4-b46a-492798633d24") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.378346 4700 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.378391 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy podName:59ea8470-d501-4d05-acb7-554792918f7c nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.878381469 +0000 UTC m=+27.674780458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy") pod "multus-additional-cni-plugins-wvx6b" (UID: "59ea8470-d501-4d05-acb7-554792918f7c") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.393169 4700 projected.go:288] Couldn't get configMap openshift-machine-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.394366 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395116 4700 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395186 4700 projected.go:194] Error preparing data for projected volume kube-api-access-96pnm for pod openshift-multus/multus-zhd4s: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395117 4700 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395280 4700 projected.go:194] Error preparing data for projected volume kube-api-access-4nr55 for pod openshift-multus/multus-additional-cni-plugins-wvx6b: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395308 4700 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm podName:869af552-a034-4af4-b46a-492798633d24 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.895264024 +0000 UTC m=+27.691663053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-96pnm" (UniqueName: "kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm") pod "multus-zhd4s" (UID: "869af552-a034-4af4-b46a-492798633d24") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.395373 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55 podName:59ea8470-d501-4d05-acb7-554792918f7c nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.895350137 +0000 UTC m=+27.691749126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4nr55" (UniqueName: "kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55") pod "multus-additional-cni-plugins-wvx6b" (UID: "59ea8470-d501-4d05-acb7-554792918f7c") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.413372 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.413409 4700 projected.go:194] Error preparing data for projected volume kube-api-access-smbht for pod openshift-machine-config-operator/machine-config-daemon-v6h5r: failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: E1007 11:21:00.413515 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht 
podName:97a77b38-e9b1-4243-ac3a-28d83d87cf15 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:00.913489296 +0000 UTC m=+27.709888285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-smbht" (UniqueName: "kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht") pod "machine-config-daemon-v6h5r" (UID: "97a77b38-e9b1-4243-ac3a-28d83d87cf15") : failed to sync configmap cache: timed out waiting for the condition Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.415154 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.425208 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.425533 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.425557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.425581 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.425595 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.428993 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.446888 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.468626 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.470787 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.482285 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.496091 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.499470 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.519070 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.528085 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.528118 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.528131 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.528150 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.528163 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.535369 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.549988 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.568166 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.580958 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.593666 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.611189 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.631630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.631677 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.631689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.631711 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.631726 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.645088 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:00Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.663272 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.676031 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.702529 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.734065 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.734104 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.734115 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.734132 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.734143 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.786264 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.802458 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.837228 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.837271 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.837283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.837316 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.837331 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.893830 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.893914 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.893943 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.894680 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869af552-a034-4af4-b46a-492798633d24-cni-binary-copy\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.894739 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea8470-d501-4d05-acb7-554792918f7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " 
pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.895132 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97a77b38-e9b1-4243-ac3a-28d83d87cf15-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.940116 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.940163 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.940174 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.940192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.940205 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:00Z","lastTransitionTime":"2025-10-07T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.994732 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbht\" (UniqueName: \"kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.994780 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr55\" (UniqueName: \"kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: \"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:00 crc kubenswrapper[4700]: I1007 11:21:00.994813 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pnm\" (UniqueName: \"kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.001419 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbht\" (UniqueName: \"kubernetes.io/projected/97a77b38-e9b1-4243-ac3a-28d83d87cf15-kube-api-access-smbht\") pod \"machine-config-daemon-v6h5r\" (UID: \"97a77b38-e9b1-4243-ac3a-28d83d87cf15\") " pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.001482 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr55\" (UniqueName: \"kubernetes.io/projected/59ea8470-d501-4d05-acb7-554792918f7c-kube-api-access-4nr55\") pod \"multus-additional-cni-plugins-wvx6b\" (UID: 
\"59ea8470-d501-4d05-acb7-554792918f7c\") " pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.002666 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pnm\" (UniqueName: \"kubernetes.io/projected/869af552-a034-4af4-b46a-492798633d24-kube-api-access-96pnm\") pod \"multus-zhd4s\" (UID: \"869af552-a034-4af4-b46a-492798633d24\") " pod="openshift-multus/multus-zhd4s" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.012891 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.017887 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zhd4s" Oct 07 11:21:01 crc kubenswrapper[4700]: W1007 11:21:01.024654 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ea8470_d501_4d05_acb7_554792918f7c.slice/crio-1762560f0c1bf2b020923eccf9f8fe4e2b4a086b5b9d778460384cd1f17eeda4 WatchSource:0}: Error finding container 1762560f0c1bf2b020923eccf9f8fe4e2b4a086b5b9d778460384cd1f17eeda4: Status 404 returned error can't find the container with id 1762560f0c1bf2b020923eccf9f8fe4e2b4a086b5b9d778460384cd1f17eeda4 Oct 07 11:21:01 crc kubenswrapper[4700]: W1007 11:21:01.027880 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869af552_a034_4af4_b46a_492798633d24.slice/crio-b7095c426781951ae9d9528b842484adee23244d75659cb9e001567f15e5ccea WatchSource:0}: Error finding container b7095c426781951ae9d9528b842484adee23244d75659cb9e001567f15e5ccea: Status 404 returned error can't find the container with id b7095c426781951ae9d9528b842484adee23244d75659cb9e001567f15e5ccea Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.042699 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.042748 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.042758 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.042776 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.042787 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.145038 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.145078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.145090 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.145108 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.145122 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152228 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152288 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152297 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152324 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.152336 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:21:01 crc kubenswrapper[4700]: 
I1007 11:21:01.154075 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerStarted","Data":"b7095c426781951ae9d9528b842484adee23244d75659cb9e001567f15e5ccea"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.155591 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerStarted","Data":"1762560f0c1bf2b020923eccf9f8fe4e2b4a086b5b9d778460384cd1f17eeda4"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.250902 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.250975 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.250990 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.251009 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.251020 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.299291 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:21:01 crc kubenswrapper[4700]: W1007 11:21:01.314856 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a77b38_e9b1_4243_ac3a_28d83d87cf15.slice/crio-45b297d220fa336dd432aa9b2c495e185a2d3923b540455a87f56a4ca610ed96 WatchSource:0}: Error finding container 45b297d220fa336dd432aa9b2c495e185a2d3923b540455a87f56a4ca610ed96: Status 404 returned error can't find the container with id 45b297d220fa336dd432aa9b2c495e185a2d3923b540455a87f56a4ca610ed96 Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.353536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.353568 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.353577 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.353595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.353604 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.456771 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.456825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.456838 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.456864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.456878 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.559101 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.559158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.559170 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.559192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.559204 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.601595 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.601766 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 11:21:09.601737772 +0000 UTC m=+36.398136761 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.661839 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.661887 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.661896 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.661913 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.661924 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.702658 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.702703 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.702728 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.702761 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.702876 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.702889 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.702929 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:09.702912151 +0000 UTC m=+36.499311140 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703000 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:09.702974313 +0000 UTC m=+36.499373312 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703131 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703148 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703167 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703200 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:09.703189759 +0000 UTC m=+36.499588758 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703268 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703282 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703292 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.703356 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:09.703347163 +0000 UTC m=+36.499746162 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.764989 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.765060 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.765079 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.765107 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.765126 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.867980 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.868037 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.868050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.868075 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.868089 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.956329 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.956396 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.956396 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.956561 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.956772 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:01 crc kubenswrapper[4700]: E1007 11:21:01.956925 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.971148 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.971193 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.971205 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.971225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:01 crc kubenswrapper[4700]: I1007 11:21:01.971241 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:01Z","lastTransitionTime":"2025-10-07T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.074173 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.074229 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.074242 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.074267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.074278 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.162977 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc" exitCode=0 Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.163131 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.165982 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.166086 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.166118 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"45b297d220fa336dd432aa9b2c495e185a2d3923b540455a87f56a4ca610ed96"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.168341 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerStarted","Data":"8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 
11:21:02.177188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.177242 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.177258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.177281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.177296 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.180214 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.202920 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.222707 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.236591 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.262530 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.276879 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.281645 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.281697 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.281710 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.281730 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.281750 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.301001 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.316111 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.331825 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.344212 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.358220 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.374047 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.385209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.385285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.385300 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.385344 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.385359 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.387737 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.400350 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.412589 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.426979 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.442003 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.466044 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.479357 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.488741 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.488811 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.488827 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.488852 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.488867 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.490554 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.507424 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.525593 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.540763 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.560198 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.585288 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.591696 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.591789 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.591810 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.591835 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.591857 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.598103 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.615348 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.630929 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:02Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.694868 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc 
kubenswrapper[4700]: I1007 11:21:02.694922 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.694939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.694960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.694973 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.799035 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.799557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.799568 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.799586 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.799599 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.902563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.902608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.902621 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.902642 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:02 crc kubenswrapper[4700]: I1007 11:21:02.902655 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:02Z","lastTransitionTime":"2025-10-07T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.006241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.006293 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.006322 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.006351 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.006366 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.109261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.109354 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.109372 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.109399 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.109417 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.174792 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.178695 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerStarted","Data":"aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.197551 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.211419 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.212689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.212732 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.212743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.212765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.212786 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.224977 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.237161 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.255290 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.268514 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.283719 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.298948 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.312261 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.315757 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.315813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.315828 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.315851 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.315865 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.331697 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.348397 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.361921 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.378050 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.396429 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.418499 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.418550 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.418558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.418578 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.418588 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.520851 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.520906 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.520926 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.520954 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.520974 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.624157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.624228 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.624243 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.624267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.624284 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.727491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.727554 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.727565 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.727585 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.727599 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.830609 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.830654 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.830665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.830682 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.830696 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.933059 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.933108 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.933119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.933138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.933149 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:03Z","lastTransitionTime":"2025-10-07T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.956699 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.956762 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:03 crc kubenswrapper[4700]: E1007 11:21:03.956956 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.957017 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:03 crc kubenswrapper[4700]: E1007 11:21:03.957149 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:03 crc kubenswrapper[4700]: E1007 11:21:03.957238 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.975100 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:03 crc kubenswrapper[4700]: I1007 11:21:03.990250 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.002499 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.003496 4700 scope.go:117] "RemoveContainer" containerID="34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2" Oct 07 11:21:04 crc kubenswrapper[4700]: E1007 11:21:04.003841 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.008245 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.033903 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.036468 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.036634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.036668 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.036700 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.036721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.051926 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.068296 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.083762 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.103399 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.117734 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.131630 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.139501 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.139570 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.139586 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.139615 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.139633 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.147913 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.161242 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5c
c94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.179734 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.186077 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334" exitCode=0 Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.186155 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" 
event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.203113 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.219961 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9b
d4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.237001 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.258917 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.258973 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.258985 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.259009 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.259022 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.269676 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.299641 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.323368 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.337247 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.354266 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.362348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.362412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.362425 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.362443 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.362454 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.375410 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.391713 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.408441 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.423181 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.437495 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.450788 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.465451 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.465491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.465505 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.465524 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.465538 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.467643 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.569207 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.569246 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.569255 4700 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.569273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.569285 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.672607 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.672685 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.672703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.672735 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.672760 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.776167 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.776230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.776241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.776259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.776270 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.880079 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.880157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.880178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.880207 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.880230 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.982795 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.982855 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.982866 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.982884 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:04 crc kubenswrapper[4700]: I1007 11:21:04.982894 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:04Z","lastTransitionTime":"2025-10-07T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.086110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.086558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.086570 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.086590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.086605 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.189679 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.189737 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.189749 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.189769 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.189785 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.197298 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.197723 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.197767 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.203111 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d" exitCode=0 Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.203163 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.214407 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.230696 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.232103 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.232174 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.247480 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.264921 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.287400 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.292507 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.292540 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.292551 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.292567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.292578 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.302430 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.319711 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.334972 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.352583 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.368267 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.382833 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.395600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.395628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.395637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.395653 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.395665 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.396449 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.408780 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5c
c94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.420584 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.439140 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.456218 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.470346 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.491530 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.498703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.498748 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.498761 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.498782 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.498796 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.505865 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.528266 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.542622 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.556758 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.569720 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.584106 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.595602 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.601197 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.601244 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.601255 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.601273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.601284 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.610048 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.625106 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.641476 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:05Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.704600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.704633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.704642 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.704656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.704670 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.808939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.808999 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.809012 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.809032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.809045 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.912391 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.912444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.912453 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.912472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.912483 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:05Z","lastTransitionTime":"2025-10-07T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.957134 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.957213 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:05 crc kubenswrapper[4700]: I1007 11:21:05.957278 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:05 crc kubenswrapper[4700]: E1007 11:21:05.957343 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:05 crc kubenswrapper[4700]: E1007 11:21:05.957463 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:05 crc kubenswrapper[4700]: E1007 11:21:05.957616 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.015723 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.015778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.015791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.015807 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.015816 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.119905 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.119988 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.120014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.120046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.120071 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.211848 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e" exitCode=0 Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.211975 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.212044 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.222573 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.222624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.222635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.222653 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.222665 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.241821 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.263084 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5c
c94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.276835 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.293207 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.315877 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.325929 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.325981 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.325991 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 
11:21:06.326011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.326022 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.331713 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.353012 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.380655 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.397366 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.413829 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.428945 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.429002 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.429054 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc 
kubenswrapper[4700]: I1007 11:21:06.429079 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.429093 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.429534 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 
11:21:06.447479 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 
11:21:06.464884 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.481196 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:06Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.532339 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.532386 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.532401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.532420 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.532436 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.636981 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.637056 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.637078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.637113 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.637139 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.741996 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.742534 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.742561 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.742595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.742621 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.845256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.845329 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.845364 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.845387 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.845402 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.947549 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.947596 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.947608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.947623 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:06 crc kubenswrapper[4700]: I1007 11:21:06.947634 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:06Z","lastTransitionTime":"2025-10-07T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.051373 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.051430 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.051440 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.051468 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.051483 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.155197 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.155257 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.155269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.155290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.155320 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.221667 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerStarted","Data":"1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.221784 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.242181 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85
a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258297 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258368 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258380 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258409 4700 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.258846 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.280469 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.297395 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.320681 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.337058 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.357049 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.361454 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.361508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.361526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.361546 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.361562 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.393673 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.411693 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.432366 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.445101 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.456593 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.464403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc 
kubenswrapper[4700]: I1007 11:21:07.464447 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.464461 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.464482 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.464494 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.472125 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.488466 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr
55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:07Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.567930 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.567990 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.568001 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.568028 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.568040 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.670539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.670596 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.670613 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.670635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.670650 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.774118 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.774167 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.774178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.774201 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.774213 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.877053 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.877108 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.877119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.877138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.877149 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.956530 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.956586 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.956586 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:07 crc kubenswrapper[4700]: E1007 11:21:07.956733 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:07 crc kubenswrapper[4700]: E1007 11:21:07.956870 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:07 crc kubenswrapper[4700]: E1007 11:21:07.957050 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.980943 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.980999 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.981016 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.981043 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:07 crc kubenswrapper[4700]: I1007 11:21:07.981057 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:07Z","lastTransitionTime":"2025-10-07T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.083776 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.083816 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.083825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.083842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.083852 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.187584 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.187665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.187692 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.187727 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.187752 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.227025 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/0.log" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.230764 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f" exitCode=1 Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.230860 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.231861 4700 scope.go:117] "RemoveContainer" containerID="ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.235883 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930" exitCode=0 Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.235987 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.257696 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.276641 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.290668 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.290731 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.290752 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.290775 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.290792 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.302598 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 
11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.322799 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.340404 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.364401 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr
55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.384825 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.394556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.395032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.395046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.395075 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.395089 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.403070 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.415235 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.427480 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.439428 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.452277 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.465241 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.478100 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.493225 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.498356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.498381 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.498396 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.498415 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.498428 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.550326 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bd
a9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.572430 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.587211 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.600543 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.601089 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.601134 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.601150 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 
11:21:08.601168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.601180 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.613488 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.625660 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.637761 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.650282 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.664338 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.680735 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.694193 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.704515 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.704560 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.704578 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc 
kubenswrapper[4700]: I1007 11:21:08.704604 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.704622 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.710800 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 
11:21:08.730521 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 
11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:08Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.808633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.808781 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.808798 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.808828 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.808847 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.912442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.912498 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.912515 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.912539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.912563 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:08Z","lastTransitionTime":"2025-10-07T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:08 crc kubenswrapper[4700]: I1007 11:21:08.955282 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.015924 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.015960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.015972 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.015988 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.015999 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.118668 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.118710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.118722 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.118743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.118757 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.222579 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.222641 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.222652 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.222670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.222683 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.241128 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/0.log" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.244519 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.245122 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.248165 4700 generic.go:334] "Generic (PLEG): container finished" podID="59ea8470-d501-4d05-acb7-554792918f7c" containerID="a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375" exitCode=0 Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.248233 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerDied","Data":"a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.260148 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.278361 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.309435 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.322389 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.325199 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.325229 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.325242 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.325258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.325271 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.335511 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.353052 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.369804 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.386673 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.404851 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.422320 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.427530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.427574 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.427587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.427606 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.427619 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.436163 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.447762 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.463873 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.509665 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca6
32ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.530320 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.530353 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.530362 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.530383 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.530393 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.537412 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.559782 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.573648 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.584611 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.598120 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.612292 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.625534 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.632629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.632674 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.632688 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.632708 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.632725 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.641541 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.655296 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.668787 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.687473 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.689757 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.690094 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:21:25.690049017 +0000 UTC m=+52.486448046 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.701594 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.726830 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.735420 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.735481 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.735495 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.735520 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.735533 4700 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.743846 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.790692 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.790766 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.790819 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.790859 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791019 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791092 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791117 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791133 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791207 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:25.791176365 +0000 UTC m=+52.587575534 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791227 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791245 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:25.791229956 +0000 UTC m=+52.587629175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791058 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791280 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:25.791256807 +0000 UTC m=+52.587656006 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791280 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791340 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.791388 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:25.79137571 +0000 UTC m=+52.587774929 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.838471 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.838542 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.838563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.838592 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.838614 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.846614 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.846655 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.846663 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.846684 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.846701 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.866720 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.871709 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.871768 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.871791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.871817 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.871834 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.887241 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.893630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.893668 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.893681 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.893705 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.893719 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.910219 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.915249 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.915292 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.915335 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.915360 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.915378 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.931732 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.937814 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.937868 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.937880 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.937903 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.937920 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.957087 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.957161 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.957088 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.957247 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.957401 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.957566 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.963452 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:09Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:09 crc kubenswrapper[4700]: E1007 11:21:09.963847 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.966199 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.966241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.966252 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.966267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:09 crc kubenswrapper[4700]: I1007 11:21:09.966278 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:09Z","lastTransitionTime":"2025-10-07T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.069768 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.069837 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.069855 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.069880 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.069900 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.174682 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.174801 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.174820 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.174845 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.174866 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.253332 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/1.log" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.253944 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/0.log" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.257072 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6" exitCode=1 Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.257165 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.257248 4700 scope.go:117] "RemoveContainer" containerID="ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.258452 4700 scope.go:117] "RemoveContainer" containerID="3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6" Oct 07 11:21:10 crc kubenswrapper[4700]: E1007 11:21:10.258890 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.263336 4700 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" event={"ID":"59ea8470-d501-4d05-acb7-554792918f7c","Type":"ContainerStarted","Data":"7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.278370 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.278605 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.278779 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.278929 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4c
f86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.278984 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.279294 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.298177 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.322227 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/li
b/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.341687 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-a
piserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:
9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" 
[serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.362012 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.383106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.383152 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.383166 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.383184 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.383196 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.393237 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.416703 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.438887 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.461206 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.475667 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.485860 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.485915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.485932 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.485960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.485983 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.493446 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.512232 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.531925 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.551868 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.574239 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.589471 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.589527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.589545 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.589574 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.589597 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.590229 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.602389 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.619529 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce54e410ec32248cf4dfca79f56111681ae597757e8b0bdb87c971c2edd7e12f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"message\\\":\\\"or removal\\\\nI1007 11:21:07.982537 5918 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 11:21:07.982591 5918 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:07.982610 5918 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:07.982616 5918 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:07.982634 5918 factory.go:656] Stopping watch factory\\\\nI1007 11:21:07.982654 5918 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:07.982665 5918 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 11:21:07.982674 5918 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:07.982682 5918 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:07.982690 5918 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:07.982698 5918 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:07.982705 5918 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:07.982875 5918 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 11:21:07.983367 5918 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/li
b/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.634461 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.651845 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.664447 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.680914 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.692705 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.692777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.692806 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.692841 
4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.692870 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.698619 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.716662 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.730148 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.747287 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.763606 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.779872 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:10Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.796574 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.796630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.796648 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.796673 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.796690 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.901042 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.901111 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.901128 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.901154 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:10 crc kubenswrapper[4700]: I1007 11:21:10.901174 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:10Z","lastTransitionTime":"2025-10-07T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.004295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.004396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.004416 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.004444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.004472 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.108534 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.108599 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.108614 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.108637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.108654 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.212537 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.212591 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.212603 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.212623 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.212638 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.268262 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/1.log" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.273215 4700 scope.go:117] "RemoveContainer" containerID="3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6" Oct 07 11:21:11 crc kubenswrapper[4700]: E1007 11:21:11.273529 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.289743 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.304838 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.316019 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.316085 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.316099 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc 
kubenswrapper[4700]: I1007 11:21:11.316119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.316130 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.324049 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 
11:21:11.354034 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.371618 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.394245 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.416496 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.419346 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.419430 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.419456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.419494 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.419525 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.436365 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.456006 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.474801 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.493055 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.511701 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.522232 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.522299 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.522337 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.522362 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.522376 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.528449 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.547220 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.624724 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.624764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.624777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.624793 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.624803 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.728465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.728529 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.728546 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.728570 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.728591 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.834968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.835036 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.835054 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.835083 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.835108 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.938806 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.938889 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.938915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.938947 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.938968 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:11Z","lastTransitionTime":"2025-10-07T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.948617 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt"] Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.949300 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.952454 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.952825 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.957062 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.957112 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.957083 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:11 crc kubenswrapper[4700]: E1007 11:21:11.957227 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:11 crc kubenswrapper[4700]: E1007 11:21:11.957350 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:11 crc kubenswrapper[4700]: E1007 11:21:11.957467 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.971246 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:11 crc kubenswrapper[4700]: I1007 11:21:11.994936 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:11Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.015190 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.015903 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.015961 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2984-d3ec-4577-893d-b9b4030db764-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.016005 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.016060 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wgz\" (UniqueName: \"kubernetes.io/projected/244b2984-d3ec-4577-893d-b9b4030db764-kube-api-access-n2wgz\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.034591 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c2
0c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.042191 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.042241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.042278 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.042330 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.042351 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.052908 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.065778 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.079948 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.096207 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.114955 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.117473 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.117535 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2984-d3ec-4577-893d-b9b4030db764-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.117585 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.117636 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wgz\" (UniqueName: \"kubernetes.io/projected/244b2984-d3ec-4577-893d-b9b4030db764-kube-api-access-n2wgz\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.118325 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.119055 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/244b2984-d3ec-4577-893d-b9b4030db764-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.122971 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2984-d3ec-4577-893d-b9b4030db764-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.131204 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.135854 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wgz\" (UniqueName: 
\"kubernetes.io/projected/244b2984-d3ec-4577-893d-b9b4030db764-kube-api-access-n2wgz\") pod \"ovnkube-control-plane-749d76644c-tbrzt\" (UID: \"244b2984-d3ec-4577-893d-b9b4030db764\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.145703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.145736 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.145746 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.145763 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.145774 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.164392 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.180550 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.201779 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.225615 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.239940 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.248695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.248767 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.248787 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.248816 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.248838 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.269217 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.353201 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.353849 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.353875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.353910 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.353936 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.456986 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.457050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.457062 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.457079 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.457091 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.561687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.561765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.561793 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.561827 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.561852 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.665061 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.665112 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.665128 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.665152 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.665164 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.769059 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.769117 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.769134 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.769155 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.769172 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.872094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.872170 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.872192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.872224 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.872298 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.975802 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.975859 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.975877 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.975905 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:12 crc kubenswrapper[4700]: I1007 11:21:12.975926 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:12Z","lastTransitionTime":"2025-10-07T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.079230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.079282 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.079296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.079334 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.079345 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.124434 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dhsvm"] Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.125623 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.125761 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.146678 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.170655 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.182092 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.182146 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.182166 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.182186 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.182199 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.198638 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.220903 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.230225 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.230299 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/25429408-169d-4998-9b40-44a882f5a89e-kube-api-access-xglxs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.241290 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.257879 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.273761 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.284813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.284864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.284879 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.284903 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.284919 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.286010 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" event={"ID":"244b2984-d3ec-4577-893d-b9b4030db764","Type":"ContainerStarted","Data":"4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.286069 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" event={"ID":"244b2984-d3ec-4577-893d-b9b4030db764","Type":"ContainerStarted","Data":"5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.286083 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" event={"ID":"244b2984-d3ec-4577-893d-b9b4030db764","Type":"ContainerStarted","Data":"b17fbe270c7fb45101d45c3161d2368da6b75aa520f27bdd75908e0a5c447c36"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.298545 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.312340 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.326768 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.330802 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/25429408-169d-4998-9b40-44a882f5a89e-kube-api-access-xglxs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.330887 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.331030 4700 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.331096 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:13.831075053 +0000 UTC m=+40.627474062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.350347 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.361009 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/25429408-169d-4998-9b40-44a882f5a89e-kube-api-access-xglxs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.365815 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc 
kubenswrapper[4700]: I1007 11:21:13.384681 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.387566 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.387602 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.387613 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.387633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.387647 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.407812 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.424630 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.439534 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.456383 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.470607 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.486278 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.490643 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.490723 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.490748 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.490786 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.490812 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.519569 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.537964 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.558543 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.582761 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.594152 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.594215 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.594234 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.594261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.594282 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.605250 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z 
is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.624011 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.643832 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0
802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.664539 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.684786 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.697793 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.697884 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.697902 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 
11:21:13.697928 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.697949 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.705759 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.725849 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.743670 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.768163 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.800952 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.801014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.801032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.801057 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.801075 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.836746 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.837046 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.837198 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:14.837166092 +0000 UTC m=+41.633565121 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.904760 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.904817 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.904830 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.904849 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.904859 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:13Z","lastTransitionTime":"2025-10-07T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.956891 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.957014 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.957118 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.957263 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.957024 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:13 crc kubenswrapper[4700]: E1007 11:21:13.957456 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:13 crc kubenswrapper[4700]: I1007 11:21:13.980364 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc 
kubenswrapper[4700]: I1007 11:21:14.001255 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.009076 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.009149 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.009172 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.009203 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.009228 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.032017 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.058201 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.074632 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28
f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.088108 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.107104 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.112106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.112163 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.112187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.112218 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.112242 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.124201 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.139080 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.151750 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.167289 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.192622 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.215186 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.215276 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.215300 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.215380 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.215410 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.217140 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d16
4aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.235197 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.255290 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.283971 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.318237 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.318285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.318299 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.318340 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.318351 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.421209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.421293 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.421349 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.421377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.421396 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.524353 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.524429 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.524446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.524474 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.524496 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.627651 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.627726 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.627742 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.627778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.627802 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.731084 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.731934 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.731957 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.731984 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.732003 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.835220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.835365 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.835387 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.835416 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.835451 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.848132 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:14 crc kubenswrapper[4700]: E1007 11:21:14.848356 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:14 crc kubenswrapper[4700]: E1007 11:21:14.848480 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:16.848451916 +0000 UTC m=+43.644850925 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.938285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.938386 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.938410 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.938440 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.938463 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:14Z","lastTransitionTime":"2025-10-07T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:14 crc kubenswrapper[4700]: I1007 11:21:14.956898 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:14 crc kubenswrapper[4700]: E1007 11:21:14.957120 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.042700 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.042792 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.042815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.042846 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.042867 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.147232 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.147359 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.147374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.147399 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.147412 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.249465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.249517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.249528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.249545 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.249556 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.352032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.352084 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.352100 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.352117 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.352133 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.455215 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.455259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.455270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.455286 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.455298 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.557625 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.557680 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.557698 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.557724 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.557744 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.660214 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.660288 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.660341 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.660372 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.660397 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.762610 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.762677 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.762701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.762733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.762755 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.866032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.866092 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.866128 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.866157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.866178 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.956495 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.956537 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.956739 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:15 crc kubenswrapper[4700]: E1007 11:21:15.956735 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:15 crc kubenswrapper[4700]: E1007 11:21:15.957133 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:15 crc kubenswrapper[4700]: E1007 11:21:15.957249 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.957618 4700 scope.go:117] "RemoveContainer" containerID="34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.968414 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.969024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.969178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.969446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:15 crc kubenswrapper[4700]: I1007 11:21:15.969618 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:15Z","lastTransitionTime":"2025-10-07T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.073961 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.074008 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.074026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.074046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.074063 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.180036 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.180131 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.180144 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.180162 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.180173 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.283464 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.283509 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.283521 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.283538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.283550 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.301527 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.304339 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.304802 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.326499 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b20
1f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.346485 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43bada
b7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.362706 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc 
kubenswrapper[4700]: I1007 11:21:16.378010 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.386155 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.386200 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.386217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.386241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.386259 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.402479 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.422437 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.438102 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.454328 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.474132 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.489164 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.489217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.489234 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.489260 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.489282 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.499006 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.515060 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.534120 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.568954 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.589840 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07c
e1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.594219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.594281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.594344 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.594377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.594403 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.609252 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.628357 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:16Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.696941 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.697242 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.697395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.697514 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.697624 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.801044 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.801098 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.801108 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.801128 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.801139 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.873629 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:16 crc kubenswrapper[4700]: E1007 11:21:16.873920 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:16 crc kubenswrapper[4700]: E1007 11:21:16.874087 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:20.874051545 +0000 UTC m=+47.670450764 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.904286 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.904337 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.904346 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.904362 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.904374 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:16Z","lastTransitionTime":"2025-10-07T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:16 crc kubenswrapper[4700]: I1007 11:21:16.956850 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:16 crc kubenswrapper[4700]: E1007 11:21:16.957064 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.007689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.007725 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.007734 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.007751 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.007762 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.110725 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.110786 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.110797 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.110815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.110826 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.214029 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.214077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.214086 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.214104 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.214116 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.317424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.317514 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.317530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.317553 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.317567 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.420531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.420616 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.420633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.420657 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.420674 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.523275 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.523337 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.523349 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.523370 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.523383 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.627077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.627151 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.627165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.627217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.627232 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.730444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.730501 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.730516 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.730540 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.730554 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.834012 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.834077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.834094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.834116 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.834129 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.937688 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.937749 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.937759 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.937788 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.937801 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:17Z","lastTransitionTime":"2025-10-07T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.957362 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:17 crc kubenswrapper[4700]: E1007 11:21:17.957531 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.957371 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:17 crc kubenswrapper[4700]: I1007 11:21:17.957351 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:17 crc kubenswrapper[4700]: E1007 11:21:17.957648 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:17 crc kubenswrapper[4700]: E1007 11:21:17.957836 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.041406 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.041494 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.041524 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.041555 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.041578 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.145041 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.145130 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.145149 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.145176 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.145195 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.247925 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.248018 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.248045 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.248080 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.248211 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.352355 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.352404 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.352421 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.352446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.352464 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.462078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.462168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.462189 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.462220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.462242 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.565590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.565653 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.565670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.565698 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.565718 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.669145 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.669206 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.669217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.669241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.669255 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.773206 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.773270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.773282 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.773332 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.773352 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.877166 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.877211 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.877220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.877239 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.877255 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.956357 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm"
Oct 07 11:21:18 crc kubenswrapper[4700]: E1007 11:21:18.956532 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.980243 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.980290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.980335 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.980357 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:18 crc kubenswrapper[4700]: I1007 11:21:18.980370 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:18Z","lastTransitionTime":"2025-10-07T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.083183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.083260 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.083284 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.083370 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.083441 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.186730 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.186812 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.186833 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.187254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.187491 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.291422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.291487 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.291508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.291533 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.291550 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.396125 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.396191 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.396221 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.396250 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.396269 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.499863 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.499923 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.499941 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.499964 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.499982 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.603228 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.603265 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.603276 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.603294 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.603325 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.707465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.707564 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.707589 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.707623 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.707652 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.810893 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.810958 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.810973 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.810996 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.811013 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.914462 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.914540 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.914561 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.914590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.914609 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:19Z","lastTransitionTime":"2025-10-07T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.957040 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.957101 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 11:21:19 crc kubenswrapper[4700]: I1007 11:21:19.957040 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:19 crc kubenswrapper[4700]: E1007 11:21:19.957233 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 11:21:19 crc kubenswrapper[4700]: E1007 11:21:19.957372 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 11:21:19 crc kubenswrapper[4700]: E1007 11:21:19.957491 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.017541 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.017617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.017636 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.017665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.017683 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.120822 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.120890 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.120904 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.120953 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.120971 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.224172 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.224233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.224247 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.224270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.224284 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.327015 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.327055 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.327066 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.327085 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.327100 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.339139 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.339170 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.339180 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.339192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.339200 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.359864 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:20Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.365553 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.365587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.365596 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.365610 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.365621 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.386039 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:20Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.392354 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.392384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.392395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.392407 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.392416 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.420657 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:20Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.426906 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.426952 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.426972 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.427002 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.427026 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.445919 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:20Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.451526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.451587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.451605 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.451630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.451650 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.471844 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:20Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.472533 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.474660 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.474719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.474743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.474775 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.474798 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.577503 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.577563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.577583 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.577606 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.577624 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.680919 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.680982 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.680997 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.681021 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.681037 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.784167 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.784233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.784245 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.784270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.784285 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.887727 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.887805 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.887825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.887857 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.887915 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.925577 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.925765 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.925842 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:28.925819519 +0000 UTC m=+55.722218548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.956863 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:20 crc kubenswrapper[4700]: E1007 11:21:20.957052 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.991481 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.991536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.991560 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.991593 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:20 crc kubenswrapper[4700]: I1007 11:21:20.991618 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:20Z","lastTransitionTime":"2025-10-07T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.095750 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.095816 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.095834 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.095860 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.095877 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.198676 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.198794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.198813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.198842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.198862 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.302580 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.302656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.302678 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.302710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.302728 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.405802 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.405877 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.405902 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.405931 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.405954 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.508740 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.508819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.508842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.508872 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.508900 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.612474 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.612539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.612559 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.612587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.612605 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.715661 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.715740 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.715775 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.715805 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.715830 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.819120 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.819212 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.819235 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.819266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.819285 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.922393 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.922444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.922456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.922480 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.922491 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:21Z","lastTransitionTime":"2025-10-07T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.957081 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.957208 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:21 crc kubenswrapper[4700]: I1007 11:21:21.957362 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:21 crc kubenswrapper[4700]: E1007 11:21:21.957355 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:21 crc kubenswrapper[4700]: E1007 11:21:21.957642 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:21 crc kubenswrapper[4700]: E1007 11:21:21.957526 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.025363 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.025402 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.025414 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.025428 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.025439 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.128418 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.128483 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.128497 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.128520 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.128547 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.231953 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.232015 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.232028 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.232046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.232062 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.334609 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.334663 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.334680 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.334701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.334719 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.441939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.442009 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.442033 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.442065 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.442088 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.545848 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.545909 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.545927 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.545960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.545981 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.649599 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.649656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.649676 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.649701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.649719 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.753261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.753326 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.753342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.753361 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.753373 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.856508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.856562 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.856573 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.856591 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.856604 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.956243 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:22 crc kubenswrapper[4700]: E1007 11:21:22.956512 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.959429 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.959482 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.959499 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.959515 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:22 crc kubenswrapper[4700]: I1007 11:21:22.959528 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:22Z","lastTransitionTime":"2025-10-07T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.062287 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.062347 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.062356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.062377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.062390 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.165107 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.165160 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.165172 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.165189 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.165201 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.268781 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.268855 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.268864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.268886 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.268896 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.371761 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.371803 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.371813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.371832 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.371898 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.475033 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.475089 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.475100 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.475119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.475131 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.577952 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.578032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.578063 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.578096 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.578119 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.681042 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.681106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.681116 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.681132 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.681143 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.784401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.784456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.784475 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.784499 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.784519 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.887082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.887157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.887180 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.887210 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.887233 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.956715 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.956731 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:23 crc kubenswrapper[4700]: E1007 11:21:23.956929 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.956725 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:23 crc kubenswrapper[4700]: E1007 11:21:23.957013 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:23 crc kubenswrapper[4700]: E1007 11:21:23.957040 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.975185 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:23Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.989743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.989815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.989830 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.989850 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.989864 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:23Z","lastTransitionTime":"2025-10-07T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:23 crc kubenswrapper[4700]: I1007 11:21:23.992677 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:23Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.008339 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.022701 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28
f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.036418 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc 
kubenswrapper[4700]: I1007 11:21:24.054690 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.070448 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.085962 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.092900 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.092955 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.092969 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.092991 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.093005 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.100723 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.115452 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.129995 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.149622 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.168996 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.186057 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.196238 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.196359 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.196379 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.196403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.196421 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.217675 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.238123 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07c
e1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:24Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.299789 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.299836 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.299847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.299872 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.299888 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.402862 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.402921 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.402935 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.402960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.402973 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.505875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.505917 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.505927 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.505948 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.505959 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.609247 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.609659 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.609756 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.609793 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.609806 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.712597 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.712668 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.712685 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.712714 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.712732 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.816356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.816416 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.816428 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.816449 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.816464 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.920092 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.920158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.920175 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.920204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.920222 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:24Z","lastTransitionTime":"2025-10-07T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:24 crc kubenswrapper[4700]: I1007 11:21:24.956974 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:24 crc kubenswrapper[4700]: E1007 11:21:24.957260 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.023424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.023486 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.023503 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.023531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.023552 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.126802 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.126865 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.126875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.126895 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.126905 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.229875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.229925 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.229936 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.229958 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.229969 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.333322 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.333372 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.333384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.333403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.333414 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.436893 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.437006 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.437032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.437068 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.437095 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.540010 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.540071 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.540093 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.540124 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.540145 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.643494 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.643569 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.643594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.643626 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.643648 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.746791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.746864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.746888 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.746916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.746938 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.780577 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.780809 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 11:21:57.780770395 +0000 UTC m=+84.577169414 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.849726 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.849799 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.849811 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.849830 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.849841 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.881673 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.881753 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.881830 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.881877 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.881968 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.881996 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882017 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882043 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882050 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882079 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:57.882058696 +0000 UTC m=+84.678457695 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882081 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882114 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882146 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882145 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:57.882094687 +0000 UTC m=+84.678493676 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882245 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:57.882226711 +0000 UTC m=+84.678625740 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.882287 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:21:57.882272012 +0000 UTC m=+84.678671051 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.953531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.953585 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.953596 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.953617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.953628 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:25Z","lastTransitionTime":"2025-10-07T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.956883 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.956952 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.956996 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.957116 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.957717 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:25 crc kubenswrapper[4700]: E1007 11:21:25.957885 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:25 crc kubenswrapper[4700]: I1007 11:21:25.959024 4700 scope.go:117] "RemoveContainer" containerID="3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.057634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.057678 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.057688 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.057708 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.057731 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.160276 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.160330 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.160342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.160356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.160366 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.263091 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.263127 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.263137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.263245 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.263267 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.344763 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/1.log" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.348341 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.348942 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.368136 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.368204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.368227 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.368266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.368290 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.374861 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.394512 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.403255 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.415243 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.435639 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.459487 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.470594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.470646 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.470659 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.470678 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 
11:21:26.470695 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.480235 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.508394 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.534618 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.559254 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.574145 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.574221 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.574241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.574266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.574285 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.576577 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.590980 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.608838 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.623587 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.637856 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.650966 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc 
kubenswrapper[4700]: I1007 11:21:26.668264 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.677850 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.677939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.677949 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.677970 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.677981 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.686787 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.703171 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving 
cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.719682 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.735080 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.764669 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.781219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.781567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.781721 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.781807 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.781867 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.785234 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.805872 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.821225 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.846091 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.862824 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc 
kubenswrapper[4700]: I1007 11:21:26.885225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.885273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.885292 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.885333 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.885354 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.885185 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48c
dbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.909662 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.929923 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.941379 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.952672 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.956144 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:26 crc kubenswrapper[4700]: E1007 11:21:26.956248 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.964006 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.988248 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 
11:21:26.988281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.988294 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.988327 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:26 crc kubenswrapper[4700]: I1007 11:21:26.988340 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:26Z","lastTransitionTime":"2025-10-07T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.092295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.092664 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.092733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.092804 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.092862 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.197042 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.197087 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.197099 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.197119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.197132 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.301006 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.301083 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.301101 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.301129 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.301151 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.353923 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/2.log" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.355226 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/1.log" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.358949 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" exitCode=1 Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.358993 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.359037 4700 scope.go:117] "RemoveContainer" containerID="3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.359947 4700 scope.go:117] "RemoveContainer" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" Oct 07 11:21:27 crc kubenswrapper[4700]: E1007 11:21:27.360188 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.375691 4700 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.390224 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.403245 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.404616 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.404645 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.404656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.404672 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.404685 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.424821 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3528327bd9cfbcdc5d2fae6f02d872cbb6683ddadbfac6138102422250af06b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"message\\\":\\\"I1007 11:21:09.455651 6095 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1007 11:21:09.455680 6095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 11:21:09.456063 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 11:21:09.456093 6095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 
11:21:09.456092 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 11:21:09.456238 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 11:21:09.456347 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 11:21:09.456421 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 11:21:09.456519 6095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 11:21:09.456616 6095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 11:21:09.456705 6095 factory.go:656] Stopping watch factory\\\\nI1007 11:21:09.456791 6095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 11:21:09.456467 6095 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 11:21:09.456991 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 11:21:09.456573 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 11:21:09.456667 6095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 11:21:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 
obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir
\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\
\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: 
I1007 11:21:27.438905 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bb
acb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.453584 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.471445 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.488731 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.505870 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.508044 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.508110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.508129 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.508156 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.508176 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.521981 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc 
kubenswrapper[4700]: I1007 11:21:27.540647 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.558494 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.573642 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.590834 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.606100 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.611349 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.611419 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.611441 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.611475 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.611498 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.623956 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48c
dbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:27Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.715338 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.715388 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.715403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.715423 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.715437 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.819039 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.819090 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.819106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.819130 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.819148 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.922536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.922620 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.922638 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.922663 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.922681 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:27Z","lastTransitionTime":"2025-10-07T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.957233 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.957358 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:27 crc kubenswrapper[4700]: E1007 11:21:27.957520 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:27 crc kubenswrapper[4700]: I1007 11:21:27.957575 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:27 crc kubenswrapper[4700]: E1007 11:21:27.957743 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:27 crc kubenswrapper[4700]: E1007 11:21:27.957857 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.025446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.025538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.025556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.025585 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.025606 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.128521 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.128616 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.128635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.128660 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.128680 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.232199 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.232256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.232270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.232293 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.232329 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.334858 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.334948 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.334971 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.335002 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.335027 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.366637 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/2.log" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.372877 4700 scope.go:117] "RemoveContainer" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" Oct 07 11:21:28 crc kubenswrapper[4700]: E1007 11:21:28.373226 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.392641 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.410620 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.431835 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.437908 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.437977 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.438005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc 
kubenswrapper[4700]: I1007 11:21:28.438037 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.438062 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.455645 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.476612 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.495386 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.526945 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.542261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.542330 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.542342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.542361 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.542375 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.564575 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.586476 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\"
,\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.606634 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.624218 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.642453 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.645289 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc 
kubenswrapper[4700]: I1007 11:21:28.645397 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.645424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.645453 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.645477 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.662502 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.672148 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.681190 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc 
kubenswrapper[4700]: I1007 11:21:28.688013 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.703488 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.726887 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.744784 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc 
kubenswrapper[4700]: I1007 11:21:28.748151 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.748186 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.748195 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.748214 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.748227 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.763762 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52
285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.781473 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.806883 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.830749 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.847736 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.850926 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.850959 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.850973 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.850993 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.851010 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.867523 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.891946 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.911799 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.929670 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.947981 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.953796 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.953842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.953851 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.953869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.953880 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:28Z","lastTransitionTime":"2025-10-07T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.956422 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:28 crc kubenswrapper[4700]: E1007 11:21:28.956572 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.968163 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:28 crc kubenswrapper[4700]: I1007 11:21:28.991082 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:28Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.016209 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerI
D\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:29Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.018935 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:29 crc kubenswrapper[4700]: E1007 11:21:29.019237 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:29 crc kubenswrapper[4700]: E1007 11:21:29.019384 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:21:45.019356786 +0000 UTC m=+71.815755815 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.035447 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:29Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.057226 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.057278 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.057291 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc 
kubenswrapper[4700]: I1007 11:21:29.057328 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.057343 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.057300 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:29Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 
11:21:29.086411 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:29Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.161290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.161362 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.161374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.161396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.161409 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.265544 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.265605 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.265621 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.265650 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.265675 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.368825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.368890 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.368909 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.368935 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.368954 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.472284 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.472381 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.472394 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.472419 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.472434 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.575227 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.575295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.575327 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.575351 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.575366 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.678284 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.678367 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.678385 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.678410 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.678422 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.781184 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.781254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.781271 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.781297 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.781355 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.884483 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.884548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.884565 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.884591 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.884615 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.956761 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.956881 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:29 crc kubenswrapper[4700]: E1007 11:21:29.956981 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:29 crc kubenswrapper[4700]: E1007 11:21:29.957121 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.957247 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:29 crc kubenswrapper[4700]: E1007 11:21:29.957389 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.987230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.987279 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.987289 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.987347 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:29 crc kubenswrapper[4700]: I1007 11:21:29.987362 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:29Z","lastTransitionTime":"2025-10-07T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.091684 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.091758 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.091784 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.091815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.091836 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.195527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.195608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.195617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.195633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.195645 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.299658 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.299723 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.299743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.299774 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.299794 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.403097 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.403192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.403211 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.403240 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.403258 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.506760 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.506818 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.506839 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.506864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.506885 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.610695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.610752 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.610764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.610783 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.610796 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.714269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.714354 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.714364 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.714382 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.714394 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.715898 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.715923 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.715932 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.715945 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.715954 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.736560 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:30Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.742987 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.743248 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.743438 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.743639 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.743826 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.767085 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:30Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.771916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.772199 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.772412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.772590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.772770 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.796469 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:30Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.804082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.804155 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.804173 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.804202 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.804226 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.828223 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:30Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.832773 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.832808 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.832818 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.832839 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.832851 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.850946 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:30Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.851188 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.853636 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.853689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.853708 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.853735 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.853755 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957260 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957274 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957333 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:30Z","lastTransitionTime":"2025-10-07T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:30 crc kubenswrapper[4700]: I1007 11:21:30.957455 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:30 crc kubenswrapper[4700]: E1007 11:21:30.957682 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.061177 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.061233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.061248 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.061273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.061286 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.165026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.165104 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.165127 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.165159 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.165181 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.268287 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.268359 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.268371 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.268389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.268398 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.371733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.371782 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.371791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.371810 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.371822 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.474950 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.475028 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.475049 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.475077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.475099 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.578392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.578442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.578455 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.578480 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.578494 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.682439 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.682508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.682523 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.682552 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.682567 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.786068 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.786118 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.786133 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.786158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.786169 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.889198 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.889256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.889270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.889297 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.889344 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.957214 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.957214 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.957423 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:31 crc kubenswrapper[4700]: E1007 11:21:31.957548 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:31 crc kubenswrapper[4700]: E1007 11:21:31.957601 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:31 crc kubenswrapper[4700]: E1007 11:21:31.957761 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.992040 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.992095 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.992135 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.992155 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:31 crc kubenswrapper[4700]: I1007 11:21:31.992168 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:31Z","lastTransitionTime":"2025-10-07T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.095083 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.095145 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.095158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.095177 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.095190 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.198704 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.198764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.198777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.198794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.198809 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.302348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.302403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.302413 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.302430 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.302441 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.405817 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.405879 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.405895 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.405919 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.405935 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.509963 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.510052 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.510082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.510113 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.510134 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.614407 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.614468 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.614493 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.614526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.614548 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.717676 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.717737 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.717754 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.717778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.717800 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.821638 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.821686 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.821695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.821712 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.821721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.925047 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.925101 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.925117 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.925136 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.925149 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:32Z","lastTransitionTime":"2025-10-07T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:32 crc kubenswrapper[4700]: I1007 11:21:32.956466 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:32 crc kubenswrapper[4700]: E1007 11:21:32.956656 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.028670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.028721 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.028733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.028753 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.028764 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.131970 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.132038 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.132050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.132071 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.132084 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.236155 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.236217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.236233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.236256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.236270 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.339165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.339249 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.339268 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.339302 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.339354 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.442675 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.442727 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.442740 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.442763 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.442776 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.546417 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.546476 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.546488 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.546508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.546520 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.650126 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.650177 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.650189 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.650209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.650224 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.754202 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.754277 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.754296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.754395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.754422 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.858053 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.858123 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.858142 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.858172 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.858190 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.957059 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.957151 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:33 crc kubenswrapper[4700]: E1007 11:21:33.957276 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:33 crc kubenswrapper[4700]: E1007 11:21:33.957547 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.957771 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:33 crc kubenswrapper[4700]: E1007 11:21:33.957989 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.962020 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.962068 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.962081 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.962103 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.962117 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:33Z","lastTransitionTime":"2025-10-07T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.978435 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48c
dbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:33Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:33 crc kubenswrapper[4700]: I1007 11:21:33.995847 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:33Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.011846 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.025651 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.041424 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.060497 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.067276 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.067357 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.067374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.067396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.067418 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.085723 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.107796 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 
2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.123392 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.145347 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.171412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.171518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.171538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.171563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.171582 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.178686 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.193890 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.207661 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.225762 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.245276 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.262717 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.274891 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.274936 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.274951 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.274969 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.274981 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.278092 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:34Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:34 crc 
kubenswrapper[4700]: I1007 11:21:34.378006 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.378645 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.378660 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.378684 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.378699 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.482401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.482819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.482977 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.483118 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.483240 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.587295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.587360 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.587373 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.587392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.587406 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.690897 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.690961 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.690984 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.691014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.691038 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.794120 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.794188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.794200 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.794221 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.794234 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.897370 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.897419 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.897429 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.897446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.897456 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:34Z","lastTransitionTime":"2025-10-07T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:34 crc kubenswrapper[4700]: I1007 11:21:34.956235 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:34 crc kubenswrapper[4700]: E1007 11:21:34.956516 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.000378 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.000441 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.000459 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.000488 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.000507 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.103726 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.103801 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.103819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.103846 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.103870 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.208138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.208238 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.208259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.208290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.208354 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.311042 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.311111 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.311129 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.311156 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.311175 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.414214 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.414270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.414279 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.414299 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.414328 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.520546 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.521229 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.521257 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.521282 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.521299 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.623655 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.623686 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.623695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.623710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.623720 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.727178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.727563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.727705 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.727989 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.728057 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.831110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.831181 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.831194 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.831230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.831244 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.936006 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.936136 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.936158 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.936187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.936242 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:35Z","lastTransitionTime":"2025-10-07T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.956576 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.956712 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:35 crc kubenswrapper[4700]: I1007 11:21:35.956797 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:35 crc kubenswrapper[4700]: E1007 11:21:35.957079 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:35 crc kubenswrapper[4700]: E1007 11:21:35.957385 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:35 crc kubenswrapper[4700]: E1007 11:21:35.957492 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.040098 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.040166 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.040182 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.040209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.040234 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.144710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.144772 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.144789 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.144814 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.144835 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.248666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.248750 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.248773 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.248807 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.248832 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.352567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.352637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.352656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.352682 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.352700 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.455559 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.455633 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.455657 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.455692 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.455721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.559439 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.559527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.559553 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.559588 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.559615 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.663215 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.663275 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.663292 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.663342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.663361 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.766812 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.766876 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.766960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.766989 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.767007 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.869558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.869629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.869648 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.869675 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.869700 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.959865 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:36 crc kubenswrapper[4700]: E1007 11:21:36.960040 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.973685 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.973719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.973728 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.973743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:36 crc kubenswrapper[4700]: I1007 11:21:36.973756 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:36Z","lastTransitionTime":"2025-10-07T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.077026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.077114 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.077131 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.077159 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.077181 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.181183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.181238 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.181252 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.181276 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.181293 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.284280 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.284348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.284359 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.284379 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.284391 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.387184 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.387267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.387283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.387300 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.387336 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.490620 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.490699 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.490717 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.490738 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.490753 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.593673 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.593746 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.593763 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.593790 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.593809 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.696843 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.696895 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.696913 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.696935 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.696952 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.800421 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.800493 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.800512 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.800540 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.800560 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.903828 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.903881 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.903897 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.903918 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.903931 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:37Z","lastTransitionTime":"2025-10-07T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.956611 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.956692 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:37 crc kubenswrapper[4700]: I1007 11:21:37.956633 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:37 crc kubenswrapper[4700]: E1007 11:21:37.956779 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:37 crc kubenswrapper[4700]: E1007 11:21:37.956872 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:37 crc kubenswrapper[4700]: E1007 11:21:37.956972 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.006431 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.006474 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.006486 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.006509 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.006525 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.109359 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.109412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.109426 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.109446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.109458 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.212880 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.212939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.212954 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.212975 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.212992 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.315963 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.316024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.316036 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.316057 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.316068 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.454387 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.454465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.454486 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.454507 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.454525 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.556765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.557068 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.557137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.557215 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.557275 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.660680 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.660741 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.660756 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.660777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.660789 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.763395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.763445 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.763456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.763472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.763483 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.866171 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.866224 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.866236 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.866254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.866268 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.956143 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:38 crc kubenswrapper[4700]: E1007 11:21:38.956293 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.969535 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.969568 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.969579 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.969594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:38 crc kubenswrapper[4700]: I1007 11:21:38.969605 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:38Z","lastTransitionTime":"2025-10-07T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.072439 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.072488 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.072505 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.072529 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.072548 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.175134 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.175190 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.175202 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.175222 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.175234 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.277968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.278008 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.278021 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.278039 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.278050 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.381256 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.381331 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.381342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.381358 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.381371 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.483870 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.484357 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.484450 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.484517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.484584 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.587769 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.587821 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.587833 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.587853 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.587869 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.690501 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.690554 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.690567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.690593 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.690607 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.792787 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.792837 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.792847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.792871 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.792884 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.895862 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.896050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.896117 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.896190 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.896269 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.956706 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:39 crc kubenswrapper[4700]: E1007 11:21:39.956906 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.957232 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:39 crc kubenswrapper[4700]: E1007 11:21:39.957641 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.957713 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:39 crc kubenswrapper[4700]: E1007 11:21:39.958299 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.999416 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.999457 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.999472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.999489 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:39 crc kubenswrapper[4700]: I1007 11:21:39.999499 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:39Z","lastTransitionTime":"2025-10-07T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.102463 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.102504 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.102519 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.102534 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.102548 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.205150 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.205192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.205201 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.205219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.205231 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.308943 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.309897 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.310047 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.310196 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.310370 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.413491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.413528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.413538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.413557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.413572 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.516828 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.516973 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.516987 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.517010 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.517026 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.620475 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.620804 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.620986 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.621190 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.621355 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.724652 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.724700 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.724715 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.724737 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.724748 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.827329 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.827371 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.827382 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.827399 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.827414 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.934220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.934260 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.934268 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.934285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.934296 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:40Z","lastTransitionTime":"2025-10-07T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.957029 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:40 crc kubenswrapper[4700]: E1007 11:21:40.957221 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:40 crc kubenswrapper[4700]: I1007 11:21:40.958029 4700 scope.go:117] "RemoveContainer" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" Oct 07 11:21:40 crc kubenswrapper[4700]: E1007 11:21:40.958213 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.037115 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.037220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.037240 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.037271 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.037288 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.139818 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.140199 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.140404 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.140586 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.140873 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.186575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.186612 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.186622 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.186639 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.186651 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.201176 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:41Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.205026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.205046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.205056 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.205070 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.205082 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.216737 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:41Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.220744 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.220765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.220773 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.220784 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.220792 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.234919 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status-patch payload identical to the 11:21:41.216737 attempt above] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:41Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.238532 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.238563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.238576 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.238595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.238608 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.250066 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:41Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.254209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.254248 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.254258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.254269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.254278 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.266910 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:41Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.267022 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.268184 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.268214 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.268228 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.268247 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.268258 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.370281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.370367 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.370377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.370395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.370407 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.472881 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.472958 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.472980 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.473011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.473035 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.576143 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.576208 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.576225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.576254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.576272 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.679448 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.679492 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.679507 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.679524 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.679534 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.782674 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.782738 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.782762 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.782787 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.782804 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.885411 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.885484 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.885569 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.885627 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.885648 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.956264 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.956407 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.956485 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.956414 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.956639 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:41 crc kubenswrapper[4700]: E1007 11:21:41.956775 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.989948 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.989994 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.990007 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.990026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:41 crc kubenswrapper[4700]: I1007 11:21:41.990038 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:41Z","lastTransitionTime":"2025-10-07T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.092962 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.093029 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.093049 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.093076 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.093097 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.200469 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.200535 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.200550 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.200572 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.200585 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.304734 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.305558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.305853 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.306056 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.306209 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.408840 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.408902 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.408915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.408934 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.408949 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.512389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.512819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.512994 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.513157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.513337 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.616356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.616396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.616405 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.616422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.616435 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.718903 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.719299 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.719465 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.719598 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.719781 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.823257 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.823331 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.823342 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.823361 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.823374 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.925583 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.925897 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.925988 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.926077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.926197 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:42Z","lastTransitionTime":"2025-10-07T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:42 crc kubenswrapper[4700]: I1007 11:21:42.956644 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:42 crc kubenswrapper[4700]: E1007 11:21:42.956850 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.029417 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.029461 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.029471 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.029489 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.029503 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.132595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.132637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.132655 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.132674 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.132713 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.236518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.236582 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.236599 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.236634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.236652 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.340198 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.340265 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.340285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.340356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.340380 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.443333 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.443384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.443395 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.443414 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.443425 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.545867 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.545915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.545933 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.545956 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.545973 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.649511 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.649587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.649617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.649648 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.649674 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.753161 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.753211 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.753228 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.753259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.753278 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.855901 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.855959 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.855974 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.855996 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.856008 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.956906 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.957166 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.957286 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:43 crc kubenswrapper[4700]: E1007 11:21:43.957355 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:43 crc kubenswrapper[4700]: E1007 11:21:43.957844 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:43 crc kubenswrapper[4700]: E1007 11:21:43.957969 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.959528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.959579 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.959598 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.959622 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.959641 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:43Z","lastTransitionTime":"2025-10-07T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.971127 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.972913 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:43Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:43 crc kubenswrapper[4700]: I1007 11:21:43.986185 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:43Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.001634 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:43Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.013324 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.023550 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.036068 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.051728 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.064624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.064743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.064796 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.064844 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.064896 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.077490 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.092383 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.105061 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.126349 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.139042 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.153110 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.168178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.168443 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.168508 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.168589 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.168709 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.169686 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834b
e26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.184101 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.196688 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28
f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.208420 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:44Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:44 crc 
kubenswrapper[4700]: I1007 11:21:44.271873 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.272013 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.272094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.272167 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.272249 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.374181 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.374538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.374610 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.374687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.374752 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.477183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.477261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.477283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.477360 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.477387 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.580138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.580183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.580194 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.580211 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.580225 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.683345 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.683389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.683401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.683421 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.683432 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.786754 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.786802 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.786819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.786839 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.786850 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.889596 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.889640 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.889652 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.889670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.889683 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.956992 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:44 crc kubenswrapper[4700]: E1007 11:21:44.957182 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.992219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.992281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.992322 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.992351 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:44 crc kubenswrapper[4700]: I1007 11:21:44.992371 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:44Z","lastTransitionTime":"2025-10-07T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.032261 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:45 crc kubenswrapper[4700]: E1007 11:21:45.032486 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:45 crc kubenswrapper[4700]: E1007 11:21:45.032605 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:22:17.032581921 +0000 UTC m=+103.828980920 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.094920 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.094975 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.094985 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.095003 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.095016 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.198347 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.198392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.198406 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.198422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.198435 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.301253 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.301290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.301317 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.301335 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.301345 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.404364 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.404399 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.404406 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.404418 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.404428 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.506891 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.506952 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.506965 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.506984 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.506994 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.610575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.610617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.610629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.610647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.610660 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.713938 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.714277 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.714377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.714476 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.714558 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.817212 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.817274 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.817286 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.817322 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.817335 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.920910 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.921295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.921396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.921498 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.921588 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:45Z","lastTransitionTime":"2025-10-07T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.957148 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:45 crc kubenswrapper[4700]: E1007 11:21:45.957445 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.957274 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 11:21:45 crc kubenswrapper[4700]: I1007 11:21:45.957274 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:45 crc kubenswrapper[4700]: E1007 11:21:45.957675 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 11:21:45 crc kubenswrapper[4700]: E1007 11:21:45.957874 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.024102 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.024154 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.024165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.024183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.024196 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.127684 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.128161 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.128426 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.128548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.128637 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.232280 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.232716 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.232798 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.232908 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.232987 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.337094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.337167 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.337188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.337215 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.337233 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.440861 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.440940 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.440950 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.440967 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.440994 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.544697 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.544765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.544783 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.544822 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.544844 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.648062 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.648700 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.648898 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.649095 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.649264 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.752506 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.752548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.752558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.752575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.752589 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.855703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.855753 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.855764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.855784 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.855796 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.956961 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm"
Oct 07 11:21:46 crc kubenswrapper[4700]: E1007 11:21:46.957274 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.958578 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.958630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.958642 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.958662 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:46 crc kubenswrapper[4700]: I1007 11:21:46.958675 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:46Z","lastTransitionTime":"2025-10-07T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.061438 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.061482 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.061496 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.061513 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.061525 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.164630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.164687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.164699 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.164720 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.164735 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.268554 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.268628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.268642 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.268664 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.268678 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.371035 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.371082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.371102 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.371119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.371129 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.473322 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.473384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.473393 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.473410 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.473422 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.577270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.577422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.577456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.577489 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.577515 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.681623 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.681690 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.681707 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.681763 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.681785 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.785260 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.785348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.785368 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.785396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.785419 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.888969 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.889049 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.889089 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.889110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.889124 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.957035 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.957110 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.957361 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:47 crc kubenswrapper[4700]: E1007 11:21:47.957302 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 11:21:47 crc kubenswrapper[4700]: E1007 11:21:47.957530 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 11:21:47 crc kubenswrapper[4700]: E1007 11:21:47.957752 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.993393 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.993488 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.993526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.993557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:47 crc kubenswrapper[4700]: I1007 11:21:47.993581 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:47Z","lastTransitionTime":"2025-10-07T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.097742 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.097786 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.097794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.097809 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.097820 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.201528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.201597 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.201611 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.201632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.201645 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.305074 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.305403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.305594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.305755 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.305921 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.409200 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.409279 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.409338 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.409378 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.409402 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.496432 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/0.log"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.496514 4700 generic.go:334] "Generic (PLEG): container finished" podID="869af552-a034-4af4-b46a-492798633d24" containerID="8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a" exitCode=1
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.496567 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerDied","Data":"8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.497221 4700 scope.go:117] "RemoveContainer" containerID="8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.514141 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.514469 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.514502 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.514843 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.515033 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.519695 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z"
Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.536877 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.553872 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.571421 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.591674 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.602692 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.617596 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.619541 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.619962 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.619975 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.619992 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.620004 4700 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.634319 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.650287 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.664212 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.683541 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.712398 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.723004 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.723056 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.723066 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.723088 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.723105 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.730133 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.743363 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc 
kubenswrapper[4700]: I1007 11:21:48.755410 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.770333 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.787011 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.803261 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:48Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.826181 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.826231 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.826240 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.826257 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.826268 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.929726 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.929794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.929812 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.929838 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.929871 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:48Z","lastTransitionTime":"2025-10-07T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:48 crc kubenswrapper[4700]: I1007 11:21:48.956508 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:48 crc kubenswrapper[4700]: E1007 11:21:48.956726 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.032707 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.033176 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.033338 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.033516 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.033634 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.136899 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.136965 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.136985 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.137011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.137032 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.240494 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.240533 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.240545 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.240567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.240581 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.343933 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.343993 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.344010 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.344033 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.344050 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.447202 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.447645 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.447828 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.448036 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.448242 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.503990 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/0.log" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.504374 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerStarted","Data":"3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.529159 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.552472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.552550 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.552575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.552606 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.552630 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.569566 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.588143 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\"
,\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.604079 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.628447 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.645846 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.655573 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.655641 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.655666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.655695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.655720 4700 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.659819 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.673952 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc 
kubenswrapper[4700]: I1007 11:21:49.690281 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.710487 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.724661 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.734819 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.747174 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.758675 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.758715 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.758745 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.758763 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.758774 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.759974 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.775609 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.794412 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.809606 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.823717 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:49Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.861934 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.861977 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.861988 4700 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.862006 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.862019 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.956336 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.956347 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:49 crc kubenswrapper[4700]: E1007 11:21:49.956565 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.956615 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:49 crc kubenswrapper[4700]: E1007 11:21:49.956729 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:49 crc kubenswrapper[4700]: E1007 11:21:49.956884 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.964590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.964645 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.964662 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.964690 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:49 crc kubenswrapper[4700]: I1007 11:21:49.964711 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:49Z","lastTransitionTime":"2025-10-07T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.067813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.067888 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.067915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.067945 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.067963 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.171647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.171695 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.171704 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.171722 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.171732 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.274791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.274853 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.274867 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.274890 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.274904 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.378278 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.378398 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.378415 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.378442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.378457 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.481149 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.481200 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.481212 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.481231 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.481244 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.584692 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.584753 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.584768 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.584794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.584811 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.687541 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.687597 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.687610 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.687628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.687641 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.792450 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.792566 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.792598 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.792631 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.792659 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.896543 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.896634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.896660 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.896697 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.896721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:50Z","lastTransitionTime":"2025-10-07T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:50 crc kubenswrapper[4700]: I1007 11:21:50.956722 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:50 crc kubenswrapper[4700]: E1007 11:21:50.956950 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.000908 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.000978 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.000998 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.001100 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.001121 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.104510 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.104592 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.104609 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.104638 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.104659 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.207548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.207622 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.207647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.207675 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.207698 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.310180 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.310269 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.310291 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.310344 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.310369 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.359530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.359591 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.359601 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.359624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.359636 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.380363 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:51Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.387469 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.387530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.387544 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.387567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.387583 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.406125 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:51Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.412385 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.412437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.412464 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.412484 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.412495 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.489803 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:51Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.489990 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.492519 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.492579 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.492598 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.492626 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.492645 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.595892 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.595940 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.595950 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.595966 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.595977 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.700140 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.700190 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.700203 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.700223 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.700232 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.803427 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.803489 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.803505 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.803532 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.803553 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.906698 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.906765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.906789 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.906818 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.906839 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:51Z","lastTransitionTime":"2025-10-07T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.957145 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.957251 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.957319 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.957454 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:51 crc kubenswrapper[4700]: I1007 11:21:51.957576 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:51 crc kubenswrapper[4700]: E1007 11:21:51.957684 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.010493 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.010565 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.010587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.010622 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.010646 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.113871 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.113930 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.113949 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.113979 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.114000 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.217832 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.217904 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.217919 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.217946 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.217962 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.320935 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.321005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.321026 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.321050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.321068 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.424739 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.424819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.424836 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.424865 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.424886 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.527908 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.527963 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.527982 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.528011 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.528031 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.631346 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.631399 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.631410 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.631431 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.631448 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.734561 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.734635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.734654 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.734681 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.734706 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.837373 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.837456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.837475 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.837495 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.837506 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.940640 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.940694 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.940710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.940740 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.940761 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:52Z","lastTransitionTime":"2025-10-07T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.956904 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:52 crc kubenswrapper[4700]: E1007 11:21:52.957102 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:52 crc kubenswrapper[4700]: I1007 11:21:52.958378 4700 scope.go:117] "RemoveContainer" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.043983 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.044048 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.044067 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.044099 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.044119 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.153830 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.154193 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.154501 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.154594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.154631 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.271777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.271843 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.271859 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.271885 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.271904 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.374983 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.375034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.375054 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.375074 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.375087 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.477808 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.477841 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.477850 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.477865 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.477873 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.522329 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/2.log" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.525412 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.526599 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.538397 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43b
adab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.551105 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc 
kubenswrapper[4700]: I1007 11:21:53.565603 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.578585 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.581238 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.581267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.581276 4700 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.581293 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.581321 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.597161 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834b
e26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.613584 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.625338 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a9
89e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.639387 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873ba
bfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.657166 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.670826 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.682143 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.683611 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.683664 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.683680 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.683704 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.683717 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.693884 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.706158 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.720998 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.733665 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.747748 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.765828 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.784234 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config
/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.786005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.786043 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.786056 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.786075 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.786090 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.889559 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.889603 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.889612 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.889628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.889638 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.956467 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.956643 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:53 crc kubenswrapper[4700]: E1007 11:21:53.956833 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.957076 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:53 crc kubenswrapper[4700]: E1007 11:21:53.957168 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:53 crc kubenswrapper[4700]: E1007 11:21:53.957385 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.978990 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.993629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.993685 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.993701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.993724 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.993742 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:53Z","lastTransitionTime":"2025-10-07T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:53 crc kubenswrapper[4700]: I1007 11:21:53.996899 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.015775 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.033764 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.057103 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config
/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.075959 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\"
,\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.094844 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.096790 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 
11:21:54.096816 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.096826 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.096845 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.096858 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.112710 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.131744 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.148590 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.167102 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc 
kubenswrapper[4700]: I1007 11:21:54.185948 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201381 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201425 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201455 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201469 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.201538 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.221181 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.236436 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.252025 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.265063 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.280191 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.305578 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.305632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.305646 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.305667 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 
11:21:54.305679 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.408603 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.408675 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.408696 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.408723 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.408744 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.511373 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.512020 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.512055 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.512086 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.512110 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.531202 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/3.log" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.532428 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/2.log" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.535929 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" exitCode=1 Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.535993 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.536047 4700 scope.go:117] "RemoveContainer" containerID="3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.537406 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:21:54 crc kubenswrapper[4700]: E1007 11:21:54.537671 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.551139 4700 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.569605 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.585156 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.599224 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.613179 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc 
kubenswrapper[4700]: I1007 11:21:54.615590 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.615637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.615647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.615666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.615679 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.629377 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52
285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.645401 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.658520 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.673019 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.689040 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.705929 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.719462 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.719540 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.719557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.719577 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.719588 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.724688 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48c
dbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.746401 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.762394 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.781617 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.803214 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.822530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.822601 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.822621 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.822649 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.822676 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.846862 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f2787d9123276a75dd019bdf6b9626c5bae9dcf0da8beed39a268372249b4fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"message\\\":\\\" because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:26Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:26.925629 6379 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-dhsvm before timer (time: 2025-10-07 11:21:28.078635138 +0000 UTC m=+1.806164794): skip\\\\nI1007 11:21:26.925649 6379 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 63.262µs)\\\\nI1007 11:21:26.925540 6379 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:53Z\\\",\\\"message\\\":\\\"ost \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:53.887722 6734 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1007 11:21:53.887704 6734 services_controller.go:473] Services do not match for 
network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Ru\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.867715 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\"
,\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:54Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.926542 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.926666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.926690 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.927222 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.927260 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:54Z","lastTransitionTime":"2025-10-07T11:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:54 crc kubenswrapper[4700]: I1007 11:21:54.957201 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:21:54 crc kubenswrapper[4700]: E1007 11:21:54.957469 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.030588 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.030642 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.030650 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.030669 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.030679 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.134070 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.134146 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.134165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.134192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.134212 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.236614 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.236719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.236737 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.236765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.236782 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.340238 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.340347 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.340370 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.340396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.340414 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.444643 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.444716 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.444770 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.444798 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.444816 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.544450 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/3.log" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.547204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.547266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.547288 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.547341 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.547363 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.551512 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:21:55 crc kubenswrapper[4700]: E1007 11:21:55.551868 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.570478 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06
c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.595184 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.650764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.650826 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.650842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.650866 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.650887 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.661425 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:53Z\\\",\\\"message\\\":\\\"ost \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:53.887722 6734 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1007 
11:21:53.887704 6734 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Ru\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.685985 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\"
,\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.702454 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.718337 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.733133 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.748186 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.753230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.753356 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.753394 4700 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.753428 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.753455 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.763814 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 
07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.779723 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.796531 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.819466 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.835222 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.850188 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.856250 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.856353 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.856369 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.856393 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.856406 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.867199 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.886127 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.908036 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.925360 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:21:55Z is after 2025-08-24T17:21:41Z" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.957361 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.957514 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.957386 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:55 crc kubenswrapper[4700]: E1007 11:21:55.957652 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:55 crc kubenswrapper[4700]: E1007 11:21:55.957725 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:55 crc kubenswrapper[4700]: E1007 11:21:55.957873 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.959963 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.960033 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.960057 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.960089 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:55 crc kubenswrapper[4700]: I1007 11:21:55.960114 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:55Z","lastTransitionTime":"2025-10-07T11:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.063390 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.063460 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.063472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.063491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.063505 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.166174 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.166230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.166245 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.166268 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.166284 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.268921 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.269007 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.269024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.269047 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.269061 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.372462 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.372518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.372527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.372546 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.372557 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.476355 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.476421 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.476440 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.476467 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.476493 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.579711 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.579801 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.579826 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.579859 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.579883 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.683437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.683517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.683538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.683571 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.683595 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.786537 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.786588 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.786606 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.786630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.786647 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.889744 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.889869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.889887 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.889914 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.889932 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.956423 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm"
Oct 07 11:21:56 crc kubenswrapper[4700]: E1007 11:21:56.956672 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.993624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.993699 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.993723 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.993749 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:56 crc kubenswrapper[4700]: I1007 11:21:56.993770 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:56Z","lastTransitionTime":"2025-10-07T11:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.097573 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.097665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.097682 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.097706 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.097724 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.201047 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.201110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.201127 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.201151 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.201169 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.303439 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.303488 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.303500 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.303517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.303531 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.406569 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.406644 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.406663 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.406691 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.406709 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.510246 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.510296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.510337 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.510360 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.510375 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.613624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.613670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.613681 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.613698 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.613710 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.717018 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.717093 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.717110 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.717137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.717155 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.783035 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.783285 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:01.783233908 +0000 UTC m=+148.579632937 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.820022 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.820087 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.820105 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.820137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.820157 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.884960 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.885042 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.885095 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.885140 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885381 4700 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885461 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:23:01.885439344 +0000 UTC m=+148.681838373 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885607 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885646 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885667 4700 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885722 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 11:23:01.885704511 +0000 UTC m=+148.682103540 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885746 4700 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885807 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 11:23:01.885789463 +0000 UTC m=+148.682188492 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885741 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885921 4700 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.885956 4700 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.886121 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 11:23:01.886083261 +0000 UTC m=+148.682482440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.923505 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.923595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.923608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.923632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.923652 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:57Z","lastTransitionTime":"2025-10-07T11:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.956281 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.956364 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 11:21:57 crc kubenswrapper[4700]: I1007 11:21:57.956436 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.956580 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.956699 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 11:21:57 crc kubenswrapper[4700]: E1007 11:21:57.956908 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.026950 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.027018 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.027035 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.027058 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.027076 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.130846 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.130930 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.130940 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.130957 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.130967 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.234582 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.234681 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.234703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.234730 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.234748 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.338064 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.338127 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.338138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.338157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.338170 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.441048 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.441088 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.441096 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.441111 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.441120 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.543687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.543742 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.543755 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.543774 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.543786 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.646777 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.646875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.646892 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.646916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.646933 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.750886 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.750959 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.750981 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.751007 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.751026 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.854625 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.854703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.854713 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.854731 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.854742 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.956851 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.957333 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.957377 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.957396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.957418 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:58 crc kubenswrapper[4700]: I1007 11:21:58.957437 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:58Z","lastTransitionTime":"2025-10-07T11:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 11:21:58 crc kubenswrapper[4700]: E1007 11:21:58.957779 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e"
Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.060188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.060273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.060290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.060340 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.060360 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.163785 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.163860 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.163869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.163888 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.163902 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.267398 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.267496 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.267519 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.267549 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.267571 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.370613 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.370677 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.370699 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.370725 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.370744 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.474523 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.474576 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.474587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.474604 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.474614 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.578870 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.578948 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.578988 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.579012 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.579033 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.681977 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.682038 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.682049 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.682070 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.682083 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.784879 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.784921 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.784933 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.784950 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.784962 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.887934 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.887994 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.888014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.888048 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.888072 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.957015 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.957075 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.957114 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:21:59 crc kubenswrapper[4700]: E1007 11:21:59.957197 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:21:59 crc kubenswrapper[4700]: E1007 11:21:59.957403 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:21:59 crc kubenswrapper[4700]: E1007 11:21:59.957491 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.991482 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.991552 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.991575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.991607 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:21:59 crc kubenswrapper[4700]: I1007 11:21:59.991629 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:21:59Z","lastTransitionTime":"2025-10-07T11:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.094165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.094207 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.094218 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.094236 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.094248 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.198021 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.198126 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.198139 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.198160 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.198199 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.301973 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.302133 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.302159 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.302197 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.302221 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.405781 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.405827 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.405838 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.405860 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.405874 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.508652 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.508895 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.508953 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.509016 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.509083 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.611636 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.611696 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.611708 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.611731 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.611745 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.715612 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.716045 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.716192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.716367 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.716505 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.819544 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.820019 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.820279 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.820531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.820733 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.924839 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.924916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.924937 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.924962 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.924980 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:00Z","lastTransitionTime":"2025-10-07T11:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:00 crc kubenswrapper[4700]: I1007 11:22:00.956591 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:00 crc kubenswrapper[4700]: E1007 11:22:00.957040 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.028034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.028103 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.028126 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.028157 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.028181 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.131137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.131507 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.131657 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.131922 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.132108 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.235386 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.235715 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.235944 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.236161 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.236401 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.339428 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.339492 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.339506 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.339532 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.339557 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.442267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.442392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.442421 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.442449 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.442467 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.545767 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.545832 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.545847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.545869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.545885 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.649391 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.649429 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.649438 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.649455 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.649465 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.720206 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.720521 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.720556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.720576 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.720589 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.738894 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:01Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.744842 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.744889 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.744900 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.744913 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.744924 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.762376 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:01Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.767504 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.767548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.767559 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.767600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.767612 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.785933 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:01Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.791614 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.791667 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.791688 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.791716 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.791739 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.813608 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the 11:22:01.785933 attempt above ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:01Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.818903 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.818936 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.818944 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.818958 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.818970 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.831365 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:01Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.831479 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.833029 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.833055 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.833067 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.833077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.833086 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.936883 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.936968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.936994 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.937024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.937045 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:01Z","lastTransitionTime":"2025-10-07T11:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.956605 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.956732 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.956915 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:01 crc kubenswrapper[4700]: I1007 11:22:01.957005 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.957256 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:01 crc kubenswrapper[4700]: E1007 11:22:01.957358 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.039701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.039745 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.039757 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.039773 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.039784 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.142746 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.142802 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.142819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.142847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.142869 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.245804 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.245864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.245881 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.245904 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.245922 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.348522 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.348630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.348656 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.348694 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.348721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.452174 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.452221 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.452232 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.452254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.452267 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.555591 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.555702 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.555724 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.555751 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.555770 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.658873 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.658960 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.658983 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.659014 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.659043 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.762528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.762609 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.762635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.762677 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.762703 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.865401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.865445 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.865456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.865472 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.865483 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.956203 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:02 crc kubenswrapper[4700]: E1007 11:22:02.956352 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.967673 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.967709 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.967717 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.967731 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:02 crc kubenswrapper[4700]: I1007 11:22:02.967745 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:02Z","lastTransitionTime":"2025-10-07T11:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.069843 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.069905 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.069923 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.069970 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.069989 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.172571 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.172649 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.172666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.172692 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.172711 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.276086 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.276143 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.276160 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.276180 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.276198 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.379004 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.379070 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.379087 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.379116 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.379135 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.482516 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.482584 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.482650 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.482679 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.482721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.585710 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.585778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.585795 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.585823 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.585847 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.689254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.689337 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.689354 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.689379 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.689399 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.792131 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.792187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.792203 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.792225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.792243 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.895511 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.895582 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.895600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.895628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.895651 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.956399 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.956482 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.956399 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:03 crc kubenswrapper[4700]: E1007 11:22:03.956573 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:03 crc kubenswrapper[4700]: E1007 11:22:03.956723 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:03 crc kubenswrapper[4700]: E1007 11:22:03.956809 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.973552 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.988448 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:03Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.998442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 
11:22:03.998484 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.998499 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.998518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:03 crc kubenswrapper[4700]: I1007 11:22:03.998532 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:03Z","lastTransitionTime":"2025-10-07T11:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.007301 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.021531 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.035125 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.047953 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc 
kubenswrapper[4700]: I1007 11:22:04.062019 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a62e670b9ee6ef\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.076523 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.089541 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.100723 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.101087 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.101134 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.101147 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.101169 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.101181 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.112212 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.123438 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.134546 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.147255 4700 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.163529 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.178265 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.192287 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.204445 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.204505 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.204517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.204536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.204552 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.209660 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:53Z\\\",\\\"message\\\":\\\"ost \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:53.887722 6734 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1007 
11:21:53.887704 6734 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Ru\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:04Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.307585 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.307646 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.307660 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.307683 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.307697 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.410882 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.410928 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.410939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.410955 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.410969 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.514612 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.514671 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.514684 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.514708 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.514724 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.617602 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.617676 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.617694 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.617721 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.617739 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.720791 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.721084 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.721149 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.721233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.721328 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.824437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.824495 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.824506 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.824526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.824541 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.927820 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.927872 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.927881 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.927900 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.927910 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:04Z","lastTransitionTime":"2025-10-07T11:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:04 crc kubenswrapper[4700]: I1007 11:22:04.956577 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:04 crc kubenswrapper[4700]: E1007 11:22:04.956739 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.031240 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.031343 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.031367 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.031450 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.031974 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.136329 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.136413 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.136431 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.136457 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.136476 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.240186 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.240266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.240278 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.240298 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.240346 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.343634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.343679 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.343689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.343703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.343713 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.446592 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.446678 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.446703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.446736 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.446758 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.549847 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.549915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.549939 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.549968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.549990 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.652721 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.652800 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.652825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.652853 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.652874 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.755947 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.756032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.756050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.756076 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.756094 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.859120 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.859164 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.859181 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.859204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.859221 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.956787 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.956933 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:05 crc kubenswrapper[4700]: E1007 11:22:05.956986 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:05 crc kubenswrapper[4700]: E1007 11:22:05.957178 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.957430 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:05 crc kubenswrapper[4700]: E1007 11:22:05.957558 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.962003 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.962078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.962106 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.962138 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:05 crc kubenswrapper[4700]: I1007 11:22:05.962181 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:05Z","lastTransitionTime":"2025-10-07T11:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.064998 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.065034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.065045 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.065060 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.065072 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.168117 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.168171 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.168187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.168207 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.168221 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.271629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.271701 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.271713 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.271736 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.271751 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.374854 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.374947 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.374990 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.375013 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.375027 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.478164 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.478207 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.478216 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.478232 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.478245 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.581518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.581580 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.581599 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.581624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.581643 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.684328 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.684373 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.684384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.684406 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.684418 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.788198 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.788262 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.788278 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.788338 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.788363 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.891889 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.891976 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.891987 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.892007 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.892019 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.956600 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:06 crc kubenswrapper[4700]: E1007 11:22:06.956786 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.995089 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.995133 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.995174 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.995192 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:06 crc kubenswrapper[4700]: I1007 11:22:06.995204 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:06Z","lastTransitionTime":"2025-10-07T11:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.102523 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.102586 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.102606 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.102632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.102648 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.205878 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.206350 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.206457 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.206563 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.206647 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.310042 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.310107 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.310124 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.310149 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.310169 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.413478 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.413920 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.414078 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.414258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.414475 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.517634 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.517678 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.517691 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.517707 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.517721 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.619442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.619513 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.619538 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.619570 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.619591 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.721941 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.722401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.722567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.722711 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.722867 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.825427 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.825709 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.825793 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.825875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.825938 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.929539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.929587 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.929600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.929617 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.929630 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:07Z","lastTransitionTime":"2025-10-07T11:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.956730 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:07 crc kubenswrapper[4700]: E1007 11:22:07.956918 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.957214 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:07 crc kubenswrapper[4700]: E1007 11:22:07.957349 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:07 crc kubenswrapper[4700]: I1007 11:22:07.957868 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:07 crc kubenswrapper[4700]: E1007 11:22:07.958369 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.032147 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.032220 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.032240 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.032267 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.032287 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.140165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.140241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.140259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.140289 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.140346 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.244402 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.244483 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.244502 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.244531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.244551 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.346936 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.346993 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.347005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.347025 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.347040 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.449581 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.449635 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.449651 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.449670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.449684 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.551732 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.551799 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.551826 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.551858 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.551883 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.655093 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.655154 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.655168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.655187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.655200 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.758236 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.758380 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.758408 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.758439 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.758465 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.860778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.860816 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.860824 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.860838 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.860848 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.956204 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:08 crc kubenswrapper[4700]: E1007 11:22:08.956629 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.957158 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:22:08 crc kubenswrapper[4700]: E1007 11:22:08.957380 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.963603 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.963669 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.963694 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.963718 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:08 crc kubenswrapper[4700]: I1007 11:22:08.963738 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:08Z","lastTransitionTime":"2025-10-07T11:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.067104 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.067188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.067209 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.067237 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.067258 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.170754 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.170799 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.170808 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.170825 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.170838 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.275290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.275394 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.275403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.275419 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.275429 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.379302 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.379810 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.379964 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.380178 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.380356 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.484672 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.485037 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.485241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.485478 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.485671 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.589216 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.589259 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.589270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.589286 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.589297 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.692351 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.692406 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.692418 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.692442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.692456 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.795697 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.796113 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.796283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.796579 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.796747 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.901480 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.901548 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.901569 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.901613 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.901635 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:09Z","lastTransitionTime":"2025-10-07T11:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.956584 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.956661 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:09 crc kubenswrapper[4700]: E1007 11:22:09.956779 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:09 crc kubenswrapper[4700]: E1007 11:22:09.956995 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:09 crc kubenswrapper[4700]: I1007 11:22:09.957026 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:09 crc kubenswrapper[4700]: E1007 11:22:09.957154 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.004835 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.004904 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.004923 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.004949 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.004972 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.108638 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.108713 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.108730 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.108761 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.108781 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.212156 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.212217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.212234 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.212369 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.212392 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.315436 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.315491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.315506 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.315528 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.315546 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.419009 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.419062 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.419085 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.419112 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.419135 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.523023 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.523113 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.523132 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.523163 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.523182 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.626291 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.626444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.626469 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.626497 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.626521 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.729080 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.729125 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.729143 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.729165 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.729182 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.832703 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.832761 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.832784 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.832813 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.832872 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.936067 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.936132 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.936156 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.936184 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.936206 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:10Z","lastTransitionTime":"2025-10-07T11:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:10 crc kubenswrapper[4700]: I1007 11:22:10.956097 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:10 crc kubenswrapper[4700]: E1007 11:22:10.956365 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.039063 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.039132 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.039168 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.039200 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.039223 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.143290 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.143401 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.143424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.143452 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.143472 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.247038 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.247094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.247111 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.247137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.247157 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.350060 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.350564 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.350770 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.351019 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.351221 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.454821 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.454874 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.454888 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.454913 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.454929 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.558572 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.558618 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.558630 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.558647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.558661 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.661655 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.662021 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.662142 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.662268 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.662405 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.764980 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.765018 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.765027 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.765041 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.765051 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.868141 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.868219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.868261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.868451 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.868491 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.956842 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.956873 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:11 crc kubenswrapper[4700]: E1007 11:22:11.957087 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.957169 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:11 crc kubenswrapper[4700]: E1007 11:22:11.957451 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:11 crc kubenswrapper[4700]: E1007 11:22:11.957606 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.970886 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.970933 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.970944 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.970968 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:11 crc kubenswrapper[4700]: I1007 11:22:11.970987 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:11Z","lastTransitionTime":"2025-10-07T11:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.074555 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.074623 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.074641 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.074666 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.074687 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.154204 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.154295 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.154352 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.154384 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.154425 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.177379 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.183607 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.183683 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.183706 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.183736 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.183771 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.205945 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.212442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.212520 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.212545 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.212580 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.212607 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.236610 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.245116 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.245219 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.245243 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.245273 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.245332 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.270246 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.275663 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.275733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.275752 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.275778 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.275799 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.296187 4700 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T11:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5b9a1a3-7b71-41ab-bb1b-c2fad22b19a9\\\",\\\"systemUUID\\\":\\\"a5893d7d-6930-4dc9-ad13-e4893f51c3ad\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:12Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.296471 4700 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.298957 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.299163 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.299188 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.299217 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.299238 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.402490 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.402556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.402575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.402600 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.402619 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.506490 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.506550 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.506573 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.506599 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.506618 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.610688 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.610762 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.610785 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.610819 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.610842 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.713010 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.713105 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.713126 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.713154 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.713172 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.816379 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.816451 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.816469 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.816496 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.816515 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.919032 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.919081 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.919093 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.919109 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.919122 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:12Z","lastTransitionTime":"2025-10-07T11:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:12 crc kubenswrapper[4700]: I1007 11:22:12.956492 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:12 crc kubenswrapper[4700]: E1007 11:22:12.956631 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.023031 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.023095 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.023113 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.023137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.023156 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.126818 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.126875 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.126893 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.126915 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.126932 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.230475 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.230566 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.230594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.230631 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.230658 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.333946 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.334015 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.334046 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.334080 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.334101 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.437848 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.437959 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.437983 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.438015 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.438039 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.541427 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.541502 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.541521 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.541551 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.541570 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.645003 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.645073 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.645094 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.645122 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.645143 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.748507 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.748588 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.748608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.748637 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.748661 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.851869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.851951 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.851969 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.851998 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.852021 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.955549 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.955628 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.955664 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.955691 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.955709 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:13Z","lastTransitionTime":"2025-10-07T11:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.956404 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.956512 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:13 crc kubenswrapper[4700]: E1007 11:22:13.956708 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.956723 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:13 crc kubenswrapper[4700]: E1007 11:22:13.956838 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:13 crc kubenswrapper[4700]: E1007 11:22:13.956954 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.970881 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee71f49a-bb16-4ed0-9777-91c1443097f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://881c4a00a6d28e60588dc9cc465730f0e190c9ac71665d691a6d0f46bbd75de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a67ca6dd1cda2310e84aa65eac4f8a81721b765bb612169f423a4a9fa5b7d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:13 crc kubenswrapper[4700]: I1007 11:22:13.988011 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe3d54c3983aed5c3c45e203aef21517833df88110e8e7ec65204aa130dad418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:13Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.005795 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e55226-0d7b-44b6-8b79-7c66e8a6aeb4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e5bbbb23890270293ac08ccb9753961becf7cdbcc561b558561dcfaebceb809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fe738d5b341983814952ef01c1d4528bbacb8bbe4dc52175e6b91df206dec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767de29591065e7f697356e237bdfe5a54704f369b7d5461b81f0415c6460070\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f802d78653da0d6865fa5c88a4c16488eb620476eb832a61905962260c1f0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34723ea0d7a6234ae8e49624ddfb46e0ca75685a5f1ebdc713218447317a64e2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T11:20:54Z\\\",\\\"message\\\":\\\"-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 11:20:54.008685 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 11:20:54.008729 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1007 11:20:54.008756 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1007 11:20:54.008575 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008831 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 11:20:54.008912 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\"\\\\nI1007 11:20:54.008931 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1707184374/tls.crt::/tmp/serving-cert-1707184374/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759836048\\\\\\\\\\\\\\\" (2025-10-07 11:20:47 +0000 UTC to 2025-11-06 11:20:48 +0000 UTC (now=2025-10-07 11:20:54.008887239 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009149 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759836053\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759836053\\\\\\\\\\\\\\\" (2025-10-07 10:20:53 +0000 UTC to 2026-10-07 10:20:53 +0000 UTC (now=2025-10-07 11:20:54.009117895 +0000 UTC))\\\\\\\"\\\\nI1007 11:20:54.009178 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1007 11:20:54.009213 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1007 11:20:54.009253 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1007 11:20:54.009448 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45c62bac29a212146a85969d42cc09640f23e23c211924fdba502166501a36b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerI
D\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d164aa5ca07ce1e4fedd8cd0d740aa7b426c738df8bee0d7b7c2438ac4251090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.025738 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.050914 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e638e01c23a35a60bccfdee8f6af2414dab0e881d71fa93e2e7de1717e28dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e58806675f1bd096e871917fd2b26d45f444156733004f8654bd69775791740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.059687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.059766 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.059787 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.059815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.059835 4700 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.079334 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T11:21:53Z\\\",\\\"message\\\":\\\"ost \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:21:53Z is after 2025-08-24T17:21:41Z]\\\\nI1007 11:21:53.887722 6734 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1007 
11:21:53.887704 6734 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Ru\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ce847a7963c2a0c1e
6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z5xw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fk4xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.095783 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"183a9149-9c82-4391-ae03-8f8c0e95649b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e380a35d97b01333eeb9aaf48ff90dc4766a55fbcb27631fea7e3a64385475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7d621df1ddc7e0317936ccecce52285289c5e438edbc91e5ce187102c9ed27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddcaa62485eb2b39092d82fb0d67856085cabef7478c4f34b0b80a082579d088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f33d7fc6a2c018fdfc7441ecbf9075c5e2cbe237c9f02d30e3dee8b553f3ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.110098 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.127721 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ea8470-d501-4d05-acb7-554792918f7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7313e5d6c49d403c834be26be9717bd719575d0f469773f15a1f982c8209d887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3d196f19f95686c05db17c34e0f0c16a7b192a4c942ffa617c47900a42f5bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef86b31a885a3e1f2737893e872450eb99a6e6766b025ae1a08c89c29c05334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://675636e9550c936782c039d8c1efdf28609bda9c25245fc75c80d54388219d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c09
75b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85c0975b3ee379fde2711deeb6f99c4cbef92468694fd2a64d4c2bd5666aa71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9c686ae421762a582e26c1c9ad61ace64675f378085cbd72f104b0fce14930\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a940fb0090b114a359ab27099910ce5ef11552d0bb58bd24f196bfd4b2250375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T11:21:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nr55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wvx6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.142444 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zhd4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"869af552-a034-4af4-b46a-492798633d24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-07T11:21:47Z\\\",\\\"message\\\":\\\"2025-10-07T11:21:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55\\\\n2025-10-07T11:21:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9add58a6-3aaf-4e8d-b919-5ceee6291a55 to /host/opt/cni/bin/\\\\n2025-10-07T11:21:02Z [verbose] multus-daemon started\\\\n2025-10-07T11:21:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T11:21:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96pnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zhd4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.153542 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"244b2984-d3ec-4577-893d-b9b4030db764\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa43badab7b685d1e4e47cae1ae05b3c890acdf44b9723e2a594f4beb57ac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d1ac3c28f1187deb559f1e8a08bd2989283f
9e23cce4808fe6a6cd71f48b75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n2wgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tbrzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.162529 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.162676 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.162696 4700 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.162719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.162768 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.167451 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25429408-169d-4998-9b40-44a882f5a89e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xglxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:21:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dhsvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 
07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.179211 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"947ddbc6-7958-46a6-ae5f-944f2061b6de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5637d1a491ba35a20eab850d2c221c6ba7b0bdd046df554c7bd4b5249c3c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d45037be48cdbfc8e22bc72bb557972ddf427a7476519ef5d8a6
2e670b9ee6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f88d4892a1a4824b944278fa8d619a1f983e3dd2be5ea07dcaf558962dd2c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ee4fdc41475fe1faee63e6f61d62267d013a1ac99b433450dbc0eba0986e50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed082
87faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.189675 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.200082 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f60b9358a5a7e2d6e9fd31dd7db072cf94ff3118682db42bc9e9109c9d6cb0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.209037 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zfjvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71b27ab9-cbbc-40db-b3b8-aee0d4de26eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0da3ce5bc83b9fd98313f12951856e694a53bd0864890cd33c969f5fc1e0060b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g76nc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zfjvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.217796 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ndw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e752a84-4326-406c-9673-bd83defa2365\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2156543bfe3645cf5621a989e423d390a0b2b5cc94e4e55ce1fcecd3c4e4aebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lchk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ndw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.229002 4700 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97a77b38-e9b1-4243-ac3a-28d83d87cf15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T11:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e499a61ee1225e7b9ad2af5b6de48463ea247761b9e8414c343ac016eb66a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T11:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smbht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T11:20:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6h5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T11:22:14Z is after 2025-08-24T17:21:41Z" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.265389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.265442 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.265460 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.265487 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.265507 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.368239 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.368360 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.368389 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.368424 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.368446 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.471876 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.472296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.472534 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.472691 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.472841 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.575369 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.575497 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.575541 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.575575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.575598 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.678845 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.678931 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.678944 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.678965 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.678980 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.781446 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.781500 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.781512 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.781530 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.781542 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.885479 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.885529 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.885539 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.885556 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.885568 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.956417 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:14 crc kubenswrapper[4700]: E1007 11:22:14.956843 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.976346 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.989491 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.989594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.989624 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.989665 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:14 crc kubenswrapper[4700]: I1007 11:22:14.989706 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:14Z","lastTransitionTime":"2025-10-07T11:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.093698 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.093946 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.094034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.094119 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.094196 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.197252 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.198150 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.198374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.198526 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.198665 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.301371 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.301728 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.301864 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.302052 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.302194 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.405296 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.405404 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.405425 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.405456 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.405478 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.509605 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.509657 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.509669 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.509689 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.509703 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.613486 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.613551 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.613568 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.613594 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.613613 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.716450 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.716504 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.716520 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.716544 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.716561 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.819814 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.819867 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.819885 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.819907 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.819924 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.923761 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.923836 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.923863 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.923910 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.923934 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:15Z","lastTransitionTime":"2025-10-07T11:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.956861 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.956891 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:15 crc kubenswrapper[4700]: I1007 11:22:15.956928 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:15 crc kubenswrapper[4700]: E1007 11:22:15.957503 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:15 crc kubenswrapper[4700]: E1007 11:22:15.957561 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:15 crc kubenswrapper[4700]: E1007 11:22:15.957613 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.027225 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.027299 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.027348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.027374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.027393 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.130997 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.131045 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.131059 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.131080 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.131096 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.234432 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.234482 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.234492 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.234511 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.234525 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.337536 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.337611 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.337632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.337662 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.337689 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.440804 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.440895 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.440914 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.440946 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.440968 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.544205 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.544270 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.544291 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.544350 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.544375 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.646805 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.646861 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.646878 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.646901 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.646920 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.750743 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.750832 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.750858 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.750891 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.750915 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.854283 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.854412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.854440 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.854468 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.854487 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.956203 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:16 crc kubenswrapper[4700]: E1007 11:22:16.956403 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.957575 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.957631 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.957654 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.957685 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:16 crc kubenswrapper[4700]: I1007 11:22:16.957709 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:16Z","lastTransitionTime":"2025-10-07T11:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.061203 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.061266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.061288 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.061333 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.061347 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.111581 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:17 crc kubenswrapper[4700]: E1007 11:22:17.111873 4700 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:22:17 crc kubenswrapper[4700]: E1007 11:22:17.112013 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs podName:25429408-169d-4998-9b40-44a882f5a89e nodeName:}" failed. No retries permitted until 2025-10-07 11:23:21.11196461 +0000 UTC m=+167.908363629 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs") pod "network-metrics-daemon-dhsvm" (UID: "25429408-169d-4998-9b40-44a882f5a89e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.165470 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.165523 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.165542 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.165567 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.165586 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.269013 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.269103 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.269140 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.269182 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.269212 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.371595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.371672 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.371699 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.371730 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.371752 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.474366 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.474434 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.474462 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.474493 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.474516 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.578008 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.578087 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.578111 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.578152 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.578175 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.681003 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.681077 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.681114 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.681147 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.681172 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.785002 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.785525 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.785597 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.785626 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.785646 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.888514 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.888632 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.888653 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.888680 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.888699 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.957347 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.957379 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:17 crc kubenswrapper[4700]: E1007 11:22:17.957566 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:17 crc kubenswrapper[4700]: E1007 11:22:17.957738 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.957965 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:17 crc kubenswrapper[4700]: E1007 11:22:17.958248 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.992201 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.992268 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.992286 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.992733 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:17 crc kubenswrapper[4700]: I1007 11:22:17.992792 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:17Z","lastTransitionTime":"2025-10-07T11:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.095527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.095574 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.095586 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.095604 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.095617 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.198659 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.198704 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.198719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.198738 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.198751 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.307457 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.307527 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.307549 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.307576 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.307599 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.410146 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.410198 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.410211 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.410233 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.410251 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.513621 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.513670 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.513687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.513711 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.513730 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.617438 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.617495 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.617514 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.617544 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.617559 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.720925 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.720981 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.721024 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.721050 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.721062 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.824649 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.824719 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.824737 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.824764 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.824784 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.928245 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.928348 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.928374 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.928403 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.928425 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:18Z","lastTransitionTime":"2025-10-07T11:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:18 crc kubenswrapper[4700]: I1007 11:22:18.956830 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:18 crc kubenswrapper[4700]: E1007 11:22:18.957060 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.031751 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.031815 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.031837 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.031866 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.031889 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.134838 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.134917 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.134938 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.134970 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.134993 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.237712 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.237765 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.237783 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.237806 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.237824 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.340585 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.340647 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.340671 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.340771 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.340849 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.444444 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.444508 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.444532 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.444560 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.444579 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.548595 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.548687 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.548717 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.548754 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.548779 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.651156 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.651218 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.651235 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.651258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.651280 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.754099 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.754183 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.754202 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.754230 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.754249 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.857517 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.857608 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.857625 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.857651 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.857669 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.957069 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.957069 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.957185 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:19 crc kubenswrapper[4700]: E1007 11:22:19.957365 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:19 crc kubenswrapper[4700]: E1007 11:22:19.957507 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:19 crc kubenswrapper[4700]: E1007 11:22:19.957591 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.966993 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.967083 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.967107 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.967137 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:19 crc kubenswrapper[4700]: I1007 11:22:19.967162 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:19Z","lastTransitionTime":"2025-10-07T11:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.070405 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.070490 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.070524 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.070561 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.070588 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.174187 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.174241 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.174258 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.174285 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.174333 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.277221 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.277281 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.277412 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.277449 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.277472 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.381247 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.381365 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.381392 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.381422 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.381442 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.485782 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.485869 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.485889 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.485955 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.485980 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.589404 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.589494 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.589518 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.589557 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.589582 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.692781 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.692849 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.692884 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.692914 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.692938 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.796916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.796988 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.797005 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.797033 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.797055 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.901031 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.901115 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.901139 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.901173 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.901201 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:20Z","lastTransitionTime":"2025-10-07T11:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.956548 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:20 crc kubenswrapper[4700]: E1007 11:22:20.956765 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:20 crc kubenswrapper[4700]: I1007 11:22:20.957833 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:22:20 crc kubenswrapper[4700]: E1007 11:22:20.958086 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fk4xc_openshift-ovn-kubernetes(d0a75e4c-2144-40de-9abc-f0bb7a143a0e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.004941 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.005023 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.005044 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.005082 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.005106 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.109254 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.109372 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.109396 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.109427 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.109449 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.212629 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.212865 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.212889 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.212916 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.212935 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.315986 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.316029 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.316037 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.316053 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.316066 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.418867 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.418925 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.419048 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.419075 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.419095 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.522369 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.522437 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.522492 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.522532 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.522556 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.625480 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.625541 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.625558 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.625582 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.625600 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.728261 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.728345 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.728363 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.728390 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.728408 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.831546 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.831603 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.831620 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.831641 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.831658 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.934739 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.934794 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.934805 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.934826 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.934840 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:21Z","lastTransitionTime":"2025-10-07T11:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.957102 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.957140 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:21 crc kubenswrapper[4700]: I1007 11:22:21.957199 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:21 crc kubenswrapper[4700]: E1007 11:22:21.957449 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:21 crc kubenswrapper[4700]: E1007 11:22:21.957532 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:21 crc kubenswrapper[4700]: E1007 11:22:21.957622 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.037531 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.037583 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.037601 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.037626 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.037646 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:22Z","lastTransitionTime":"2025-10-07T11:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.140850 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.140914 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.140938 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.140964 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.140986 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:22Z","lastTransitionTime":"2025-10-07T11:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.243907 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.243979 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.244004 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.244034 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.244058 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:22Z","lastTransitionTime":"2025-10-07T11:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.348127 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.348196 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.348214 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.348266 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.348285 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:22Z","lastTransitionTime":"2025-10-07T11:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.364001 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.364067 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.364086 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.364108 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.364123 4700 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T11:22:22Z","lastTransitionTime":"2025-10-07T11:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.436163 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx"] Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.436715 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.440365 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.440810 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.440929 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.443046 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.506004 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.505976652 podStartE2EDuration="54.505976652s" podCreationTimestamp="2025-10-07 11:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.485355494 +0000 UTC m=+109.281754523" watchObservedRunningTime="2025-10-07 11:22:22.505976652 +0000 UTC m=+109.302375671" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.535574 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wvx6b" podStartSLOduration=84.535549888 podStartE2EDuration="1m24.535549888s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.534813928 +0000 UTC m=+109.331212957" watchObservedRunningTime="2025-10-07 11:22:22.535549888 +0000 
UTC m=+109.331948917" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.577697 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1088c7b5-239c-445b-835b-7e88e96927ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.577780 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.577817 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1088c7b5-239c-445b-835b-7e88e96927ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.577931 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.577965 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1088c7b5-239c-445b-835b-7e88e96927ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.579815 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zhd4s" podStartSLOduration=84.579773933 podStartE2EDuration="1m24.579773933s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.561368114 +0000 UTC m=+109.357767143" watchObservedRunningTime="2025-10-07 11:22:22.579773933 +0000 UTC m=+109.376172932" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.580052 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tbrzt" podStartSLOduration=84.58004169 podStartE2EDuration="1m24.58004169s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.579383883 +0000 UTC m=+109.375782902" watchObservedRunningTime="2025-10-07 11:22:22.58004169 +0000 UTC m=+109.376440689" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.594436 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podStartSLOduration=84.594413662 podStartE2EDuration="1m24.594413662s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.594410922 +0000 UTC m=+109.390809941" watchObservedRunningTime="2025-10-07 
11:22:22.594413662 +0000 UTC m=+109.390812661" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.623838 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.623810633 podStartE2EDuration="1m25.623810633s" podCreationTimestamp="2025-10-07 11:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.622839777 +0000 UTC m=+109.419238776" watchObservedRunningTime="2025-10-07 11:22:22.623810633 +0000 UTC m=+109.420209632" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679480 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679545 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1088c7b5-239c-445b-835b-7e88e96927ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679593 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1088c7b5-239c-445b-835b-7e88e96927ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679643 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679654 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679753 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1088c7b5-239c-445b-835b-7e88e96927ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.679780 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1088c7b5-239c-445b-835b-7e88e96927ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.680238 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zfjvk" podStartSLOduration=84.680225482 podStartE2EDuration="1m24.680225482s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.680031767 +0000 UTC m=+109.476430796" watchObservedRunningTime="2025-10-07 11:22:22.680225482 +0000 UTC m=+109.476624481" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.681606 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1088c7b5-239c-445b-835b-7e88e96927ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.688787 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1088c7b5-239c-445b-835b-7e88e96927ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.697191 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ndw62" podStartSLOduration=84.697167932 podStartE2EDuration="1m24.697167932s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.696614928 +0000 UTC m=+109.493013927" watchObservedRunningTime="2025-10-07 11:22:22.697167932 +0000 UTC m=+109.493566931" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.713865 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1088c7b5-239c-445b-835b-7e88e96927ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkhgx\" (UID: \"1088c7b5-239c-445b-835b-7e88e96927ff\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.752693 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.752667027 podStartE2EDuration="39.752667027s" podCreationTimestamp="2025-10-07 11:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.721286203 +0000 UTC m=+109.517685212" watchObservedRunningTime="2025-10-07 11:22:22.752667027 +0000 UTC m=+109.549066036" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.763344 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.775864 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=8.775837763 podStartE2EDuration="8.775837763s" podCreationTimestamp="2025-10-07 11:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.752138143 +0000 UTC m=+109.548537182" watchObservedRunningTime="2025-10-07 11:22:22.775837763 +0000 UTC m=+109.572236762" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.827861 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.827834035 podStartE2EDuration="1m28.827834035s" podCreationTimestamp="2025-10-07 11:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:22.826992072 +0000 UTC m=+109.623391081" watchObservedRunningTime="2025-10-07 11:22:22.827834035 +0000 UTC 
m=+109.624233024" Oct 07 11:22:22 crc kubenswrapper[4700]: I1007 11:22:22.957160 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:22 crc kubenswrapper[4700]: E1007 11:22:22.957301 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:23 crc kubenswrapper[4700]: I1007 11:22:23.653484 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" event={"ID":"1088c7b5-239c-445b-835b-7e88e96927ff","Type":"ContainerStarted","Data":"f68849090e2f9a6def445cf2e8a875c0166c54fa3ffb343e11601daf3063ca04"} Oct 07 11:22:23 crc kubenswrapper[4700]: I1007 11:22:23.653563 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" event={"ID":"1088c7b5-239c-445b-835b-7e88e96927ff","Type":"ContainerStarted","Data":"980c2096e4ef9b650871e953bee295c8e4f2ff713a28bc4bc5913ffaaec29478"} Oct 07 11:22:23 crc kubenswrapper[4700]: I1007 11:22:23.676611 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkhgx" podStartSLOduration=85.676584128 podStartE2EDuration="1m25.676584128s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:23.676015933 +0000 UTC m=+110.472414962" watchObservedRunningTime="2025-10-07 11:22:23.676584128 +0000 UTC m=+110.472983127" Oct 07 11:22:23 crc kubenswrapper[4700]: 
I1007 11:22:23.956977 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:23 crc kubenswrapper[4700]: E1007 11:22:23.958924 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:23 crc kubenswrapper[4700]: I1007 11:22:23.959047 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:23 crc kubenswrapper[4700]: I1007 11:22:23.959156 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:23 crc kubenswrapper[4700]: E1007 11:22:23.959337 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:23 crc kubenswrapper[4700]: E1007 11:22:23.959670 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:24 crc kubenswrapper[4700]: I1007 11:22:24.956820 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:24 crc kubenswrapper[4700]: E1007 11:22:24.957085 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:25 crc kubenswrapper[4700]: I1007 11:22:25.956889 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:25 crc kubenswrapper[4700]: I1007 11:22:25.957009 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:25 crc kubenswrapper[4700]: E1007 11:22:25.957113 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:25 crc kubenswrapper[4700]: E1007 11:22:25.957176 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:25 crc kubenswrapper[4700]: I1007 11:22:25.957811 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:25 crc kubenswrapper[4700]: E1007 11:22:25.957977 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:26 crc kubenswrapper[4700]: I1007 11:22:26.957033 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:26 crc kubenswrapper[4700]: E1007 11:22:26.958159 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:27 crc kubenswrapper[4700]: I1007 11:22:27.956451 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:27 crc kubenswrapper[4700]: I1007 11:22:27.956513 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:27 crc kubenswrapper[4700]: E1007 11:22:27.956643 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:27 crc kubenswrapper[4700]: I1007 11:22:27.956740 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:27 crc kubenswrapper[4700]: E1007 11:22:27.956968 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:27 crc kubenswrapper[4700]: E1007 11:22:27.957132 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:28 crc kubenswrapper[4700]: I1007 11:22:28.956433 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:28 crc kubenswrapper[4700]: E1007 11:22:28.956918 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:29 crc kubenswrapper[4700]: I1007 11:22:29.957131 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:29 crc kubenswrapper[4700]: I1007 11:22:29.957238 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:29 crc kubenswrapper[4700]: I1007 11:22:29.957151 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:29 crc kubenswrapper[4700]: E1007 11:22:29.957439 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:29 crc kubenswrapper[4700]: E1007 11:22:29.957558 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:29 crc kubenswrapper[4700]: E1007 11:22:29.957746 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:30 crc kubenswrapper[4700]: I1007 11:22:30.956737 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:30 crc kubenswrapper[4700]: E1007 11:22:30.956985 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:31 crc kubenswrapper[4700]: I1007 11:22:31.957047 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:31 crc kubenswrapper[4700]: I1007 11:22:31.957130 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:31 crc kubenswrapper[4700]: I1007 11:22:31.957067 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:31 crc kubenswrapper[4700]: E1007 11:22:31.957246 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:31 crc kubenswrapper[4700]: E1007 11:22:31.957423 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:31 crc kubenswrapper[4700]: E1007 11:22:31.957722 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:32 crc kubenswrapper[4700]: I1007 11:22:32.956365 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:32 crc kubenswrapper[4700]: E1007 11:22:32.957140 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:33 crc kubenswrapper[4700]: E1007 11:22:33.945240 4700 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 07 11:22:33 crc kubenswrapper[4700]: I1007 11:22:33.956871 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:33 crc kubenswrapper[4700]: I1007 11:22:33.957082 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:33 crc kubenswrapper[4700]: I1007 11:22:33.960063 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:33 crc kubenswrapper[4700]: E1007 11:22:33.960230 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:33 crc kubenswrapper[4700]: E1007 11:22:33.960424 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:33 crc kubenswrapper[4700]: E1007 11:22:33.960605 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:33 crc kubenswrapper[4700]: I1007 11:22:33.960837 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:22:34 crc kubenswrapper[4700]: E1007 11:22:34.060800 4700 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.697279 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/1.log" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.697922 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/0.log" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.697981 4700 generic.go:334] "Generic (PLEG): container finished" podID="869af552-a034-4af4-b46a-492798633d24" containerID="3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb" exitCode=1 Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.698066 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerDied","Data":"3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb"} Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.698126 4700 scope.go:117] "RemoveContainer" containerID="8f4ca632ea79463df83b0d189baab4934b201f9b783575a8e9b70e62a92e9c3a" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.698728 4700 scope.go:117] "RemoveContainer" containerID="3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb" Oct 07 11:22:34 crc kubenswrapper[4700]: E1007 11:22:34.698959 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zhd4s_openshift-multus(869af552-a034-4af4-b46a-492798633d24)\"" pod="openshift-multus/multus-zhd4s" podUID="869af552-a034-4af4-b46a-492798633d24" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.700803 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/3.log" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.707203 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerStarted","Data":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.707653 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:22:34 crc kubenswrapper[4700]: I1007 11:22:34.956859 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:34 crc kubenswrapper[4700]: E1007 11:22:34.956997 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.090010 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podStartSLOduration=97.089988598 podStartE2EDuration="1m37.089988598s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:22:34.761586693 +0000 UTC m=+121.557985692" watchObservedRunningTime="2025-10-07 11:22:35.089988598 +0000 UTC m=+121.886387587" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.091072 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhsvm"] Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.712512 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/1.log" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.712622 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:35 crc kubenswrapper[4700]: E1007 11:22:35.712734 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.957214 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:35 crc kubenswrapper[4700]: E1007 11:22:35.957752 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.957259 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:35 crc kubenswrapper[4700]: I1007 11:22:35.957391 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:35 crc kubenswrapper[4700]: E1007 11:22:35.958012 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:35 crc kubenswrapper[4700]: E1007 11:22:35.958158 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:37 crc kubenswrapper[4700]: I1007 11:22:37.957354 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:37 crc kubenswrapper[4700]: I1007 11:22:37.957439 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:37 crc kubenswrapper[4700]: E1007 11:22:37.957612 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:37 crc kubenswrapper[4700]: I1007 11:22:37.957742 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:37 crc kubenswrapper[4700]: E1007 11:22:37.957833 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:37 crc kubenswrapper[4700]: I1007 11:22:37.957869 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:37 crc kubenswrapper[4700]: E1007 11:22:37.958052 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:37 crc kubenswrapper[4700]: E1007 11:22:37.958155 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:39 crc kubenswrapper[4700]: E1007 11:22:39.063281 4700 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 11:22:39 crc kubenswrapper[4700]: I1007 11:22:39.956623 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:39 crc kubenswrapper[4700]: I1007 11:22:39.956697 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:39 crc kubenswrapper[4700]: I1007 11:22:39.956761 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:39 crc kubenswrapper[4700]: I1007 11:22:39.956847 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:39 crc kubenswrapper[4700]: E1007 11:22:39.956845 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:39 crc kubenswrapper[4700]: E1007 11:22:39.956960 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:39 crc kubenswrapper[4700]: E1007 11:22:39.957039 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:39 crc kubenswrapper[4700]: E1007 11:22:39.957299 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:41 crc kubenswrapper[4700]: I1007 11:22:41.956922 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:41 crc kubenswrapper[4700]: I1007 11:22:41.957004 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:41 crc kubenswrapper[4700]: I1007 11:22:41.957021 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:41 crc kubenswrapper[4700]: I1007 11:22:41.956939 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:41 crc kubenswrapper[4700]: E1007 11:22:41.957263 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:41 crc kubenswrapper[4700]: E1007 11:22:41.957455 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:41 crc kubenswrapper[4700]: E1007 11:22:41.957570 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:41 crc kubenswrapper[4700]: E1007 11:22:41.957803 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:43 crc kubenswrapper[4700]: I1007 11:22:43.957026 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:43 crc kubenswrapper[4700]: I1007 11:22:43.957076 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:43 crc kubenswrapper[4700]: I1007 11:22:43.957037 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:43 crc kubenswrapper[4700]: I1007 11:22:43.957116 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:43 crc kubenswrapper[4700]: E1007 11:22:43.959275 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:43 crc kubenswrapper[4700]: E1007 11:22:43.959476 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:43 crc kubenswrapper[4700]: E1007 11:22:43.959974 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:43 crc kubenswrapper[4700]: E1007 11:22:43.960088 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:44 crc kubenswrapper[4700]: E1007 11:22:44.064004 4700 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 11:22:45 crc kubenswrapper[4700]: I1007 11:22:45.956414 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:45 crc kubenswrapper[4700]: I1007 11:22:45.956414 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:45 crc kubenswrapper[4700]: E1007 11:22:45.956639 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:45 crc kubenswrapper[4700]: E1007 11:22:45.956869 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:45 crc kubenswrapper[4700]: I1007 11:22:45.957053 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:45 crc kubenswrapper[4700]: E1007 11:22:45.957170 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:45 crc kubenswrapper[4700]: I1007 11:22:45.957050 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:45 crc kubenswrapper[4700]: E1007 11:22:45.957441 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:47 crc kubenswrapper[4700]: I1007 11:22:47.956505 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:47 crc kubenswrapper[4700]: I1007 11:22:47.956587 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:47 crc kubenswrapper[4700]: I1007 11:22:47.956579 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:47 crc kubenswrapper[4700]: I1007 11:22:47.956525 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:47 crc kubenswrapper[4700]: E1007 11:22:47.956790 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:47 crc kubenswrapper[4700]: E1007 11:22:47.956883 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:47 crc kubenswrapper[4700]: E1007 11:22:47.957009 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:47 crc kubenswrapper[4700]: E1007 11:22:47.957092 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:47 crc kubenswrapper[4700]: I1007 11:22:47.957647 4700 scope.go:117] "RemoveContainer" containerID="3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb" Oct 07 11:22:48 crc kubenswrapper[4700]: I1007 11:22:48.765627 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/1.log" Oct 07 11:22:48 crc kubenswrapper[4700]: I1007 11:22:48.766096 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerStarted","Data":"bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5"} Oct 07 11:22:49 crc kubenswrapper[4700]: E1007 11:22:49.065625 4700 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 11:22:49 crc kubenswrapper[4700]: I1007 11:22:49.956815 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:49 crc kubenswrapper[4700]: I1007 11:22:49.956883 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:49 crc kubenswrapper[4700]: I1007 11:22:49.956970 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:49 crc kubenswrapper[4700]: I1007 11:22:49.957022 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:49 crc kubenswrapper[4700]: E1007 11:22:49.957030 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:49 crc kubenswrapper[4700]: E1007 11:22:49.957157 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:49 crc kubenswrapper[4700]: E1007 11:22:49.957252 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:49 crc kubenswrapper[4700]: E1007 11:22:49.957299 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:51 crc kubenswrapper[4700]: I1007 11:22:51.957009 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:51 crc kubenswrapper[4700]: I1007 11:22:51.957813 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:51 crc kubenswrapper[4700]: I1007 11:22:51.957870 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:51 crc kubenswrapper[4700]: I1007 11:22:51.957898 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:51 crc kubenswrapper[4700]: E1007 11:22:51.958016 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:51 crc kubenswrapper[4700]: E1007 11:22:51.958117 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:51 crc kubenswrapper[4700]: E1007 11:22:51.958214 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:51 crc kubenswrapper[4700]: E1007 11:22:51.958395 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:53 crc kubenswrapper[4700]: I1007 11:22:53.956556 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:53 crc kubenswrapper[4700]: I1007 11:22:53.956581 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:53 crc kubenswrapper[4700]: I1007 11:22:53.956635 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:53 crc kubenswrapper[4700]: E1007 11:22:53.957942 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 11:22:53 crc kubenswrapper[4700]: I1007 11:22:53.957957 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:53 crc kubenswrapper[4700]: E1007 11:22:53.958096 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 11:22:53 crc kubenswrapper[4700]: E1007 11:22:53.958220 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 11:22:53 crc kubenswrapper[4700]: E1007 11:22:53.958339 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhsvm" podUID="25429408-169d-4998-9b40-44a882f5a89e" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.956691 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.956758 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.956779 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.956952 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.959362 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.961033 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.961290 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.961551 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.961798 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 11:22:55 crc kubenswrapper[4700]: I1007 11:22:55.962000 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.852986 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:01 crc kubenswrapper[4700]: E1007 11:23:01.853463 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 11:25:03.853415422 +0000 UTC m=+270.649814411 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.955034 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.955107 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.955130 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.955154 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.956753 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.963598 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.964436 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.968061 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:23:01 crc kubenswrapper[4700]: I1007 11:23:01.982901 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.026794 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.038284 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 11:23:02 crc kubenswrapper[4700]: W1007 11:23:02.258380 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-fd869a39018c5a0505737f11035befeb290481215322a7bfb9b25f7d52257d54 WatchSource:0}: Error finding container fd869a39018c5a0505737f11035befeb290481215322a7bfb9b25f7d52257d54: Status 404 returned error can't find the container with id fd869a39018c5a0505737f11035befeb290481215322a7bfb9b25f7d52257d54 Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.824690 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"da6a054cbde78c22531efec94b41bb6023141fe5ecf13dc9a8e3cc30c7a9deac"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.824768 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"749941db9b3ce7c38284c06a7ea6a2876ccfe55603e9699baed29601f5340dcf"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.826569 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b4949e9e39e65b623f465515138a9168713301ddb104bcae0ec417905b26d50b"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.826625 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fd869a39018c5a0505737f11035befeb290481215322a7bfb9b25f7d52257d54"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.828187 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0fdff66d490906f96cdc220c1eb7d38a13a44296816fe12ea32d767cd76a20f3"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.828283 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"12cde1bc92271e946232b5c05877667ec934a84c8fea7a391216b82661acdb50"} Oct 07 11:23:02 crc kubenswrapper[4700]: I1007 11:23:02.828536 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.392775 4700 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.455638 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.456194 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.461079 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zhvq9"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.462136 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.462995 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.464221 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.464534 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.465283 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.467179 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.468195 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.468788 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.469941 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.470519 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7l929"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.471189 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.471987 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.473117 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.509441 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.509901 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.511704 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.512422 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.512493 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.512651 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 
11:23:03.512427 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.512702 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.512944 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.526804 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.527065 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.527489 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.527652 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.527936 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.528107 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.528290 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.528469 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.528704 4700 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.528883 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.529158 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.529290 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.529943 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.530225 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.530435 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.530625 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.530745 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.530825 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.531493 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.543691 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.557008 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.557434 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.557614 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.557777 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.561733 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hbprv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.562749 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.564382 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.564510 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.564658 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.565455 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.565572 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.566511 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.569087 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.569338 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.569777 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.570161 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.570377 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580112 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580641 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-dir\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580675 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-config\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580697 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-image-import-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580715 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-serving-cert\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " 
pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580732 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e78af36c-9fef-4db9-80a6-b0d02485f7bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580750 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmghn\" (UniqueName: \"kubernetes.io/projected/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-kube-api-access-cmghn\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580769 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wbdv\" (UniqueName: \"kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580785 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580801 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-serving-cert\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580815 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580829 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580843 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-encryption-config\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580864 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580881 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-trusted-ca\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580897 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-428cg\" (UniqueName: \"kubernetes.io/projected/e78af36c-9fef-4db9-80a6-b0d02485f7bf-kube-api-access-428cg\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580911 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580928 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b03323a0-5b5d-431a-b5a5-532110e46ab9-machine-approver-tls\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580943 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78af36c-9fef-4db9-80a6-b0d02485f7bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580975 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.580997 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit-dir\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581015 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-config\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581030 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca\") pod \"controller-manager-879f6c89f-8l887\" (UID: 
\"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581046 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trt6k\" (UniqueName: \"kubernetes.io/projected/b03323a0-5b5d-431a-b5a5-532110e46ab9-kube-api-access-trt6k\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581062 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581078 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581092 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xktkr\" (UniqueName: \"kubernetes.io/projected/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-kube-api-access-xktkr\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581111 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581131 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581150 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-client\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581167 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581185 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581202 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581217 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581231 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581254 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dgn\" (UniqueName: \"kubernetes.io/projected/fe7d5c4e-f774-4519-8696-781c27efb691-kube-api-access-q9dgn\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581269 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-policies\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581291 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581322 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581339 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gpjk\" (UniqueName: \"kubernetes.io/projected/8c79b619-73da-4918-b5fd-e45ac5463a91-kube-api-access-2gpjk\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581358 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581375 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-node-pullsecrets\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581390 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581419 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581434 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d5c4e-f774-4519-8696-781c27efb691-serving-cert\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581450 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-encryption-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581467 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whct\" (UniqueName: \"kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581489 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-client\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581520 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581538 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: 
\"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581553 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581572 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-auth-proxy-config\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.581589 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582033 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582471 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582586 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.582691 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582765 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582777 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582904 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582930 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.582986 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.583104 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.583246 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.585805 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.586371 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.586665 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.587020 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.587284 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.587944 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.589386 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4bv9h"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.589845 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.591585 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.593675 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.595775 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.600220 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.600661 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nc8d6"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.600968 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tpvgp"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.601267 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.601556 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.601592 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.607729 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.607942 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.623624 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.624251 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.624935 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zrjt7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.625290 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.625661 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.625857 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.625980 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.627008 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.627888 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-26t2g"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.628232 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.630414 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.631117 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.639850 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.640542 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.641859 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.642054 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.642206 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.642381 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.651086 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.660014 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.660778 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.660968 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.660992 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.661148 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.661345 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.668761 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.672171 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.672331 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.692480 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.692698 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.692834 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693579 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xktkr\" (UniqueName: \"kubernetes.io/projected/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-kube-api-access-xktkr\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693622 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert\") pod 
\"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693645 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693661 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693688 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693704 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693721 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-serving-cert\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693736 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693754 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxwj\" (UniqueName: \"kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693771 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693786 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: 
I1007 11:23:03.693801 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693817 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-client\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693835 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693851 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693866 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 
11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693887 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dgn\" (UniqueName: \"kubernetes.io/projected/fe7d5c4e-f774-4519-8696-781c27efb691-kube-api-access-q9dgn\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693902 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-policies\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693917 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693931 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693950 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mkkg\" (UniqueName: \"kubernetes.io/projected/fe97f639-3a7f-437e-945c-e5a530726ced-kube-api-access-4mkkg\") pod 
\"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693967 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gpjk\" (UniqueName: \"kubernetes.io/projected/8c79b619-73da-4918-b5fd-e45ac5463a91-kube-api-access-2gpjk\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.693985 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c92443d-f8f6-4941-9729-013d10138707-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694002 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694020 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 
11:23:03.694037 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-node-pullsecrets\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694053 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694070 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694089 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0705c1f1-8602-47fc-9b52-82a70cf21976-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694106 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0705c1f1-8602-47fc-9b52-82a70cf21976-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: 
\"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694123 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfrmn\" (UniqueName: \"kubernetes.io/projected/0705c1f1-8602-47fc-9b52-82a70cf21976-kube-api-access-hfrmn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694140 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d5c4e-f774-4519-8696-781c27efb691-serving-cert\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694155 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-encryption-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694171 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694226 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9whct\" (UniqueName: \"kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694246 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-images\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694266 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694284 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-client\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694312 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: 
I1007 11:23:03.694329 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694351 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-auth-proxy-config\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694365 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694382 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8h9\" (UniqueName: \"kubernetes.io/projected/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-kube-api-access-bq8h9\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694398 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-config\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694414 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-image-import-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694427 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-serving-cert\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694441 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-dir\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694458 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e78af36c-9fef-4db9-80a6-b0d02485f7bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694475 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmghn\" (UniqueName: \"kubernetes.io/projected/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-kube-api-access-cmghn\") pod 
\"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694492 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe97f639-3a7f-437e-945c-e5a530726ced-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694509 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wbdv\" (UniqueName: \"kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694524 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694542 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694557 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-serving-cert\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694574 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrhq\" (UniqueName: \"kubernetes.io/projected/4c92443d-f8f6-4941-9729-013d10138707-kube-api-access-zqrhq\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694591 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694607 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694623 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-config\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: 
\"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694641 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-trusted-ca\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694659 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-428cg\" (UniqueName: \"kubernetes.io/projected/e78af36c-9fef-4db9-80a6-b0d02485f7bf-kube-api-access-428cg\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694676 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694692 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-encryption-config\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694709 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694723 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b03323a0-5b5d-431a-b5a5-532110e46ab9-machine-approver-tls\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694740 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78af36c-9fef-4db9-80a6-b0d02485f7bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694766 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694783 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 
11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694801 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit-dir\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694819 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-config\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694836 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694853 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-config\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694870 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca\") pod \"controller-manager-879f6c89f-8l887\" (UID: 
\"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.694889 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trt6k\" (UniqueName: \"kubernetes.io/projected/b03323a0-5b5d-431a-b5a5-532110e46ab9-kube-api-access-trt6k\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.695411 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.695537 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.696169 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.696621 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.696900 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.697177 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.699433 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.699591 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.699682 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.699780 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.700280 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.700951 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.701292 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.701548 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.701643 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702147 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702214 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-node-pullsecrets\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702325 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-policies\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702503 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702564 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c79b619-73da-4918-b5fd-e45ac5463a91-audit-dir\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.702698 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.703133 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.703155 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e78af36c-9fef-4db9-80a6-b0d02485f7bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.703755 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.703998 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.704367 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.704517 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-etcd-client\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.704662 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.705557 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-encryption-config\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.707403 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.707864 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.708444 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-config\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") 
" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.708676 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.709143 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.709343 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.709655 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.709875 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.709894 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b03323a0-5b5d-431a-b5a5-532110e46ab9-auth-proxy-config\") pod 
\"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.710123 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-serving-cert\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.710900 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711057 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711601 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711644 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-audit-dir\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.711712 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711742 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711875 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.711975 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.712059 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.712505 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.712787 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-serving-cert\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 
crc kubenswrapper[4700]: I1007 11:23:03.713038 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d5c4e-f774-4519-8696-781c27efb691-serving-cert\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.713180 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-config\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.713454 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.713593 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.714187 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.714582 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78af36c-9fef-4db9-80a6-b0d02485f7bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.714898 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715345 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715476 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715559 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715485 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715644 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715724 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.715935 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716079 4700 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716129 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716139 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716260 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716415 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716474 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716627 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716680 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.716825 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.717157 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-etcd-client\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" 
Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.717530 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-image-import-ca\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.718850 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b03323a0-5b5d-431a-b5a5-532110e46ab9-machine-approver-tls\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.720226 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.721633 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c79b619-73da-4918-b5fd-e45ac5463a91-encryption-config\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.722364 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.722492 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d5c4e-f774-4519-8696-781c27efb691-trusted-ca\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " 
pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.725037 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.725062 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.727742 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.728346 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.730927 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.731814 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7l929"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.731838 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wx8hg"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.732318 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.732599 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.735861 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.736477 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.737031 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.738459 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.738851 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.740269 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.750059 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.750686 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.756490 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.759273 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.759488 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.759906 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.762383 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d47lw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.762497 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.762495 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.763437 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.763809 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.763970 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.764343 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.764511 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.764926 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.764988 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tpvgp"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.765011 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.765026 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.765042 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zhvq9"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.765106 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.766436 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nc8d6"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.768478 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.769844 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wflpr"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.770573 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.770979 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4bv9h"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.771972 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.773171 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.774124 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.774686 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.775123 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v9nfw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.775712 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.776139 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.777469 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.778552 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.779067 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hbprv"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.781755 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zrjt7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.782822 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.783948 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.785057 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf"] 
Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.786151 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.787187 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wflpr"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.788641 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.790255 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.791641 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.792694 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wx8hg"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.794164 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.795849 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkf47\" (UniqueName: \"kubernetes.io/projected/1530c252-8f8d-4bf2-89f9-6d358d9cd617-kube-api-access-vkf47\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.795911 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt5w4\" (UniqueName: 
\"kubernetes.io/projected/351bb56c-499b-456c-81e2-ea2664ca5960-kube-api-access-dt5w4\") pod \"downloads-7954f5f757-tpvgp\" (UID: \"351bb56c-499b-456c-81e2-ea2664ca5960\") " pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.795941 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8h9\" (UniqueName: \"kubernetes.io/projected/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-kube-api-access-bq8h9\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.795963 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe97f639-3a7f-437e-945c-e5a530726ced-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796002 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796036 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.796075 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrhq\" (UniqueName: \"kubernetes.io/projected/4c92443d-f8f6-4941-9729-013d10138707-kube-api-access-zqrhq\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796092 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796111 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/111689f4-eb0d-47d5-800c-0eaa9acd7425-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796150 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-config\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796171 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-client\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796188 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796206 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adfb3989-3a4d-4796-8c41-d6f554acea6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796255 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25wsz\" (UniqueName: \"kubernetes.io/projected/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-kube-api-access-25wsz\") pod \"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796272 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqk6\" (UniqueName: \"kubernetes.io/projected/111689f4-eb0d-47d5-800c-0eaa9acd7425-kube-api-access-cdqk6\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796322 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz5mc\" (UniqueName: \"kubernetes.io/projected/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-kube-api-access-jz5mc\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796296 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796355 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/111689f4-eb0d-47d5-800c-0eaa9acd7425-proxy-tls\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796398 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796419 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-metrics-tls\") pod \"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.796489 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-config\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796527 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796576 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfgc\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-kube-api-access-mcfgc\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796621 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796651 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/01f5674d-e0dc-4a62-94a8-7200c30f524d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796684 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-images\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796714 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-serving-cert\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796768 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796795 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxwj\" (UniqueName: \"kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796828 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2007404d-830b-44fe-b627-18b205d9f566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796849 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796865 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1530c252-8f8d-4bf2-89f9-6d358d9cd617-proxy-tls\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796885 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796902 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2007404d-830b-44fe-b627-18b205d9f566-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796919 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-serving-cert\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796935 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796951 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2007404d-830b-44fe-b627-18b205d9f566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.796984 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mkkg\" (UniqueName: \"kubernetes.io/projected/fe97f639-3a7f-437e-945c-e5a530726ced-kube-api-access-4mkkg\") pod \"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc 
kubenswrapper[4700]: I1007 11:23:03.797011 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c92443d-f8f6-4941-9729-013d10138707-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797058 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/adfb3989-3a4d-4796-8c41-d6f554acea6d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797079 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797099 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adfb3989-3a4d-4796-8c41-d6f554acea6d-config\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797118 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-service-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797136 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmc5k\" (UniqueName: \"kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797155 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01f5674d-e0dc-4a62-94a8-7200c30f524d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797178 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0705c1f1-8602-47fc-9b52-82a70cf21976-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797198 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0705c1f1-8602-47fc-9b52-82a70cf21976-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797216 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797236 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfrmn\" (UniqueName: \"kubernetes.io/projected/0705c1f1-8602-47fc-9b52-82a70cf21976-kube-api-access-hfrmn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797252 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-config\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797271 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797297 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-images\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797672 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-config\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.797982 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4c92443d-f8f6-4941-9729-013d10138707-images\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.798329 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.799200 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.799225 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-config\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.799982 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.800924 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.801039 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0705c1f1-8602-47fc-9b52-82a70cf21976-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.801145 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 
11:23:03.801419 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.802575 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe97f639-3a7f-437e-945c-e5a530726ced-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.802833 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.803069 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c92443d-f8f6-4941-9729-013d10138707-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: \"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.803282 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.803412 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-serving-cert\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.803762 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.805086 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.805325 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.806646 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.807813 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.809028 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9nfw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.810226 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d47lw"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.811433 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q"] Oct 07 
11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.812710 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.813964 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8j6qj"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.814710 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.816249 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p6cjh"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.818545 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.818630 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.818896 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0705c1f1-8602-47fc-9b52-82a70cf21976-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.819181 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.819742 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.821993 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p6cjh"] Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.838698 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.859588 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.879675 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.898982 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900221 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/111689f4-eb0d-47d5-800c-0eaa9acd7425-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900289 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900342 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-client\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900364 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900396 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adfb3989-3a4d-4796-8c41-d6f554acea6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" 
Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900452 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25wsz\" (UniqueName: \"kubernetes.io/projected/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-kube-api-access-25wsz\") pod \"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900494 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqk6\" (UniqueName: \"kubernetes.io/projected/111689f4-eb0d-47d5-800c-0eaa9acd7425-kube-api-access-cdqk6\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900520 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz5mc\" (UniqueName: \"kubernetes.io/projected/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-kube-api-access-jz5mc\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900563 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/111689f4-eb0d-47d5-800c-0eaa9acd7425-proxy-tls\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900584 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-metrics-tls\") pod 
\"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900630 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfgc\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-kube-api-access-mcfgc\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900661 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/01f5674d-e0dc-4a62-94a8-7200c30f524d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900691 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-images\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900734 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1530c252-8f8d-4bf2-89f9-6d358d9cd617-proxy-tls\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900772 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2007404d-830b-44fe-b627-18b205d9f566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900834 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2007404d-830b-44fe-b627-18b205d9f566-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900868 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900903 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-serving-cert\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900936 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: 
\"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.900958 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2007404d-830b-44fe-b627-18b205d9f566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901015 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/adfb3989-3a4d-4796-8c41-d6f554acea6d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901042 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adfb3989-3a4d-4796-8c41-d6f554acea6d-config\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901069 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-service-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901093 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mmc5k\" (UniqueName: \"kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901120 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01f5674d-e0dc-4a62-94a8-7200c30f524d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901146 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/111689f4-eb0d-47d5-800c-0eaa9acd7425-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901159 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-config\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901189 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901242 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt5w4\" (UniqueName: \"kubernetes.io/projected/351bb56c-499b-456c-81e2-ea2664ca5960-kube-api-access-dt5w4\") pod \"downloads-7954f5f757-tpvgp\" (UID: \"351bb56c-499b-456c-81e2-ea2664ca5960\") " pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901271 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkf47\" (UniqueName: \"kubernetes.io/projected/1530c252-8f8d-4bf2-89f9-6d358d9cd617-kube-api-access-vkf47\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.901343 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.903097 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.903795 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2007404d-830b-44fe-b627-18b205d9f566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.904419 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-config\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.904664 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-service-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.904689 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-ca\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.905207 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adfb3989-3a4d-4796-8c41-d6f554acea6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.905247 4700 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-etcd-client\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.905741 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adfb3989-3a4d-4796-8c41-d6f554acea6d-config\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.906467 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-serving-cert\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.906921 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-metrics-tls\") pod \"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.907146 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01f5674d-e0dc-4a62-94a8-7200c30f524d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.908052 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2007404d-830b-44fe-b627-18b205d9f566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.919556 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.939856 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.960311 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.981699 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 11:23:03 crc kubenswrapper[4700]: I1007 11:23:03.999507 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.019634 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.039183 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.059895 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.078969 4700 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.099733 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.118701 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.139291 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.145939 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.158967 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.179889 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.189329 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.198831 4700 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.220459 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.223685 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.241851 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.248857 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/01f5674d-e0dc-4a62-94a8-7200c30f524d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.258676 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.295027 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trt6k\" (UniqueName: \"kubernetes.io/projected/b03323a0-5b5d-431a-b5a5-532110e46ab9-kube-api-access-trt6k\") pod \"machine-approver-56656f9798-qmpjz\" (UID: \"b03323a0-5b5d-431a-b5a5-532110e46ab9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 
11:23:04.318212 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xktkr\" (UniqueName: \"kubernetes.io/projected/a40421f2-28c2-4eec-8a1b-77d96ac0b4fc-kube-api-access-xktkr\") pod \"apiserver-76f77b778f-zhvq9\" (UID: \"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc\") " pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.319455 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.339966 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.347903 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1530c252-8f8d-4bf2-89f9-6d358d9cd617-proxy-tls\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.360017 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.364987 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1530c252-8f8d-4bf2-89f9-6d358d9cd617-images\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.379015 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 11:23:04 crc kubenswrapper[4700]: 
I1007 11:23:04.388336 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/111689f4-eb0d-47d5-800c-0eaa9acd7425-proxy-tls\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.388544 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.408768 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.434881 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gpjk\" (UniqueName: \"kubernetes.io/projected/8c79b619-73da-4918-b5fd-e45ac5463a91-kube-api-access-2gpjk\") pod \"apiserver-7bbb656c7d-svs9t\" (UID: \"8c79b619-73da-4918-b5fd-e45ac5463a91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.453162 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.458106 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dgn\" (UniqueName: \"kubernetes.io/projected/fe7d5c4e-f774-4519-8696-781c27efb691-kube-api-access-q9dgn\") pod \"console-operator-58897d9998-7l929\" (UID: \"fe7d5c4e-f774-4519-8696-781c27efb691\") " pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.498807 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wbdv\" (UniqueName: \"kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv\") pod \"oauth-openshift-558db77b4-wlgsp\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.500211 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmghn\" (UniqueName: \"kubernetes.io/projected/dc801e9f-fe8b-425a-9d80-4c1c67cbd959-kube-api-access-cmghn\") pod \"openshift-apiserver-operator-796bbdcf4f-2xjmg\" (UID: \"dc801e9f-fe8b-425a-9d80-4c1c67cbd959\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.516276 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whct\" (UniqueName: \"kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct\") pod \"controller-manager-879f6c89f-8l887\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.538414 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-428cg\" (UniqueName: 
\"kubernetes.io/projected/e78af36c-9fef-4db9-80a6-b0d02485f7bf-kube-api-access-428cg\") pod \"openshift-config-operator-7777fb866f-b6rkf\" (UID: \"e78af36c-9fef-4db9-80a6-b0d02485f7bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.578949 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.599884 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.618995 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.639676 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.659918 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.677352 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.688228 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.699156 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.699570 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.716131 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.719262 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.731377 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.737680 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.740465 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.747406 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.757673 4700 request.go:700] Waited for 1.018930043s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.765714 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.781396 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.806609 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.819481 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.841159 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zhvq9"] Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.841453 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.863692 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.865344 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" event={"ID":"b03323a0-5b5d-431a-b5a5-532110e46ab9","Type":"ContainerStarted","Data":"5543d8dcc845e72ba93608cab5d6e6741139bd63b409ad402d2b9e5bb4c93e68"} Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.865409 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" event={"ID":"b03323a0-5b5d-431a-b5a5-532110e46ab9","Type":"ContainerStarted","Data":"ab8fdd6ed5f288603d3c44775a036d73f896704e4a980a3ebe5940b85ed619a7"} Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.879412 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.902954 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.924367 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.927160 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.939131 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.958804 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 11:23:04 crc kubenswrapper[4700]: I1007 11:23:04.980857 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 
11:23:05.001151 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.004031 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t"] Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.020810 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 11:23:05 crc kubenswrapper[4700]: W1007 11:23:05.033264 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c79b619_73da_4918_b5fd_e45ac5463a91.slice/crio-d8dfb0e4a3d58431aa0e416e091ae705afbb6b8507ebbd4639319b724ead922b WatchSource:0}: Error finding container d8dfb0e4a3d58431aa0e416e091ae705afbb6b8507ebbd4639319b724ead922b: Status 404 returned error can't find the container with id d8dfb0e4a3d58431aa0e416e091ae705afbb6b8507ebbd4639319b724ead922b Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.037097 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.038839 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.046383 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7l929"] Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.067566 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.071132 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf"] Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.081068 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.082387 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg"] Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.098706 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: W1007 11:23:05.112044 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode78af36c_9fef_4db9_80a6_b0d02485f7bf.slice/crio-285675edab4e32f91d84afeb9542f149a6aad6457d1ab0f983347f872b8259e6 WatchSource:0}: Error finding container 285675edab4e32f91d84afeb9542f149a6aad6457d1ab0f983347f872b8259e6: Status 404 returned error can't find the container with id 285675edab4e32f91d84afeb9542f149a6aad6457d1ab0f983347f872b8259e6 Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.118977 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.139132 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.159290 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.179135 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.199287 4700 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.219203 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.238738 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.259360 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.279895 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.299101 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.319171 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.339561 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.359342 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.379241 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.399475 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 11:23:05 crc 
kubenswrapper[4700]: I1007 11:23:05.419333 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.439322 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.458476 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.480774 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.499154 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.519490 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.539232 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.582432 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8h9\" (UniqueName: \"kubernetes.io/projected/dce1cb95-805f-4793-9a3e-45f4b18ca4cc-kube-api-access-bq8h9\") pod \"authentication-operator-69f744f599-4bv9h\" (UID: \"dce1cb95-805f-4793-9a3e-45f4b18ca4cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.601925 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrhq\" (UniqueName: \"kubernetes.io/projected/4c92443d-f8f6-4941-9729-013d10138707-kube-api-access-zqrhq\") pod \"machine-api-operator-5694c8668f-hbprv\" (UID: 
\"4c92443d-f8f6-4941-9729-013d10138707\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.613859 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mkkg\" (UniqueName: \"kubernetes.io/projected/fe97f639-3a7f-437e-945c-e5a530726ced-kube-api-access-4mkkg\") pod \"cluster-samples-operator-665b6dd947-z84rb\" (UID: \"fe97f639-3a7f-437e-945c-e5a530726ced\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.647316 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxwj\" (UniqueName: \"kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj\") pod \"console-f9d7485db-xx8r8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.659921 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.671891 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfrmn\" (UniqueName: \"kubernetes.io/projected/0705c1f1-8602-47fc-9b52-82a70cf21976-kube-api-access-hfrmn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqwpq\" (UID: \"0705c1f1-8602-47fc-9b52-82a70cf21976\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.682451 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.699273 4700 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.713111 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.718711 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.719356 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.740076 4700 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.754167 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.759693 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.778439 4700 request.go:700] Waited for 1.876538933s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/serviceaccounts/dns-operator/token Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.802221 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25wsz\" (UniqueName: \"kubernetes.io/projected/3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4-kube-api-access-25wsz\") pod \"dns-operator-744455d44c-nc8d6\" (UID: \"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 
11:23:05.864984 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.872246 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.875545 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" event={"ID":"dc801e9f-fe8b-425a-9d80-4c1c67cbd959","Type":"ContainerStarted","Data":"33937973b1c082a6d37ac24c310e0ffd715835144f57b4b67ea1675ddcd618ef"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.875585 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" event={"ID":"dc801e9f-fe8b-425a-9d80-4c1c67cbd959","Type":"ContainerStarted","Data":"8b4131e99c7b0da91ba3756d4623fb4e412e157f09dc9431866d156c6fc2f870"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.882064 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz5mc\" (UniqueName: \"kubernetes.io/projected/2e3a06e3-520e-4096-ad0e-550a3cdd2a2b-kube-api-access-jz5mc\") pod \"etcd-operator-b45778765-zrjt7\" (UID: \"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.883325 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/adfb3989-3a4d-4796-8c41-d6f554acea6d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wkhrv\" (UID: \"adfb3989-3a4d-4796-8c41-d6f554acea6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:05 crc 
kubenswrapper[4700]: I1007 11:23:05.883488 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqk6\" (UniqueName: \"kubernetes.io/projected/111689f4-eb0d-47d5-800c-0eaa9acd7425-kube-api-access-cdqk6\") pod \"machine-config-controller-84d6567774-fvtx9\" (UID: \"111689f4-eb0d-47d5-800c-0eaa9acd7425\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.885821 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfgc\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-kube-api-access-mcfgc\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.892115 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.892415 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" event={"ID":"b03323a0-5b5d-431a-b5a5-532110e46ab9","Type":"ContainerStarted","Data":"778fd6405446a2b73370134489b106b6dbd3a37dd0c279966a57565101d5b95f"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.895814 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01f5674d-e0dc-4a62-94a8-7200c30f524d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g2fw7\" (UID: \"01f5674d-e0dc-4a62-94a8-7200c30f524d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.899937 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="e78af36c-9fef-4db9-80a6-b0d02485f7bf" containerID="c5858d06a65a49af7ca6c7cb2d8f55ab47a297d925e2e0b866378f86ce56b7c2" exitCode=0 Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.900312 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" event={"ID":"e78af36c-9fef-4db9-80a6-b0d02485f7bf","Type":"ContainerDied","Data":"c5858d06a65a49af7ca6c7cb2d8f55ab47a297d925e2e0b866378f86ce56b7c2"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.900375 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" event={"ID":"e78af36c-9fef-4db9-80a6-b0d02485f7bf","Type":"ContainerStarted","Data":"285675edab4e32f91d84afeb9542f149a6aad6457d1ab0f983347f872b8259e6"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.907446 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7l929" event={"ID":"fe7d5c4e-f774-4519-8696-781c27efb691","Type":"ContainerStarted","Data":"b504523119ab393cfb615e6e3006a1f399102099b47822690303b245dbf8d92e"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.907507 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7l929" event={"ID":"fe7d5c4e-f774-4519-8696-781c27efb691","Type":"ContainerStarted","Data":"3af06060b4dcf718a8d0d4de6a879314bdbe9c8544dbea7ed83c3d8620f885ad"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.907666 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7l929" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.909449 4700 generic.go:334] "Generic (PLEG): container finished" podID="8c79b619-73da-4918-b5fd-e45ac5463a91" containerID="9edcdac044a475e735d4a2cb46af097e819e3b6d619f954fcee175c9d6e21e9e" exitCode=0 Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 
11:23:05.909719 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" event={"ID":"8c79b619-73da-4918-b5fd-e45ac5463a91","Type":"ContainerDied","Data":"9edcdac044a475e735d4a2cb46af097e819e3b6d619f954fcee175c9d6e21e9e"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.909753 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" event={"ID":"8c79b619-73da-4918-b5fd-e45ac5463a91","Type":"ContainerStarted","Data":"d8dfb0e4a3d58431aa0e416e091ae705afbb6b8507ebbd4639319b724ead922b"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.912659 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" event={"ID":"274a19c3-7e08-4994-986f-d43d111bde3c","Type":"ContainerStarted","Data":"37784e25a0e2dc2e3c9b3254490435972bb12f0e74718e9e6d8269ded191cf20"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.912757 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" event={"ID":"274a19c3-7e08-4994-986f-d43d111bde3c","Type":"ContainerStarted","Data":"ec2b30a4a9c5458750cfda38b4855805c77705295d8bbdbbf6f916eed128cf6c"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.912822 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.914192 4700 patch_prober.go:28] interesting pod/console-operator-58897d9998-7l929 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.914250 4700 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-7l929" podUID="fe7d5c4e-f774-4519-8696-781c27efb691" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.915558 4700 generic.go:334] "Generic (PLEG): container finished" podID="a40421f2-28c2-4eec-8a1b-77d96ac0b4fc" containerID="d20ad89685cbd323162aa2bf484de534a9d419551f4dc2bc0c7a5beb070cf5b0" exitCode=0 Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.915622 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" event={"ID":"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc","Type":"ContainerDied","Data":"d20ad89685cbd323162aa2bf484de534a9d419551f4dc2bc0c7a5beb070cf5b0"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.915658 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" event={"ID":"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc","Type":"ContainerStarted","Data":"5114cfdb7b01b83cf6d0290fc3de3b81f51c8d3d71b6baaa0722f68ea3fe7c7f"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.918282 4700 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wlgsp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.918351 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.921358 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2007404d-830b-44fe-b627-18b205d9f566-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttmh8\" (UID: \"2007404d-830b-44fe-b627-18b205d9f566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.925638 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" event={"ID":"2557333d-abb9-4863-81c9-397307a108f6","Type":"ContainerStarted","Data":"39e4ebfa264df42fd9eacf75383dfc334325d3e2a6b41394ee3e60d2671296a5"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.925705 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" event={"ID":"2557333d-abb9-4863-81c9-397307a108f6","Type":"ContainerStarted","Data":"c241313bd9a927ee4267bbafe5d7419fc930cf2569335ebbdcee6f498c357b4d"} Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.926357 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.929978 4700 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8l887 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.930125 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: 
connection refused" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.933144 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmc5k\" (UniqueName: \"kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k\") pod \"route-controller-manager-6576b87f9c-4c7j6\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.956326 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.960269 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt5w4\" (UniqueName: \"kubernetes.io/projected/351bb56c-499b-456c-81e2-ea2664ca5960-kube-api-access-dt5w4\") pod \"downloads-7954f5f757-tpvgp\" (UID: \"351bb56c-499b-456c-81e2-ea2664ca5960\") " pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.962620 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.972914 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.973450 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkf47\" (UniqueName: \"kubernetes.io/projected/1530c252-8f8d-4bf2-89f9-6d358d9cd617-kube-api-access-vkf47\") pod \"machine-config-operator-74547568cd-lhj8c\" (UID: \"1530c252-8f8d-4bf2-89f9-6d358d9cd617\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:05 crc kubenswrapper[4700]: I1007 11:23:05.994391 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.002787 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.009178 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.021232 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.026416 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067373 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-default-certificate\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067415 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067439 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067458 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-stats-auth\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067477 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-metrics-certs\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067502 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067562 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067582 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77938bd8-bce7-4ca6-abcd-e0e965e47530-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067601 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jsq\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067633 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067650 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067673 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdkk2\" (UniqueName: \"kubernetes.io/projected/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-kube-api-access-tdkk2\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067703 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067720 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-service-ca-bundle\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067741 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77938bd8-bce7-4ca6-abcd-e0e965e47530-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.067767 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77938bd8-bce7-4ca6-abcd-e0e965e47530-config\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.067914 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:06.567892471 +0000 UTC m=+153.364291460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.082987 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96767da2_aab3_4f6d_a14b_ada1c0a4ded8.slice/crio-c6538b61de4ebf7d45dffdffbabd115c544a735d56563decaa9c4b00e96a73f5 WatchSource:0}: Error finding container c6538b61de4ebf7d45dffdffbabd115c544a735d56563decaa9c4b00e96a73f5: Status 404 returned error can't find the container with id c6538b61de4ebf7d45dffdffbabd115c544a735d56563decaa9c4b00e96a73f5 Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.121823 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4bv9h"] Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.160691 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce1cb95_805f_4793_9a3e_45f4b18ca4cc.slice/crio-68f6db70100b8a020ad910ae3f3e876c798d83921a3c7e25c2f93ed61ec4df23 WatchSource:0}: Error finding container 68f6db70100b8a020ad910ae3f3e876c798d83921a3c7e25c2f93ed61ec4df23: Status 404 returned error can't find the container with id 68f6db70100b8a020ad910ae3f3e876c798d83921a3c7e25c2f93ed61ec4df23 Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.168972 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169351 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-service-ca-bundle\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169388 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169408 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77938bd8-bce7-4ca6-abcd-e0e965e47530-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169457 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c271ef31-887c-4b30-857a-7969eb9063bf-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc 
kubenswrapper[4700]: I1007 11:23:06.169495 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqszd\" (UniqueName: \"kubernetes.io/projected/deda04fb-8c07-460f-9756-c6ee7572925f-kube-api-access-vqszd\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169523 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-srv-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169543 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402cf0e4-8561-4c43-8d2b-f2cb5b369345-trusted-ca\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169563 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77938bd8-bce7-4ca6-abcd-e0e965e47530-config\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169650 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9249\" (UniqueName: 
\"kubernetes.io/projected/d4a12197-db67-481c-8508-a908ee9e9f01-kube-api-access-s9249\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169675 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdc2s\" (UniqueName: \"kubernetes.io/projected/c271ef31-887c-4b30-857a-7969eb9063bf-kube-api-access-sdc2s\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169707 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-srv-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169737 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x29h\" (UniqueName: \"kubernetes.io/projected/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-kube-api-access-6x29h\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: \"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169753 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-socket-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " 
pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169770 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs2x\" (UniqueName: \"kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169801 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zfr\" (UniqueName: \"kubernetes.io/projected/bf93caea-b071-422d-ab31-9739c096160c-kube-api-access-c8zfr\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169819 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7759fe-2642-4798-95ad-09f9c3d0b901-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.169893 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-tmpfs\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170021 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-default-certificate\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170042 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwzns\" (UniqueName: \"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-kube-api-access-bwzns\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170072 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/402cf0e4-8561-4c43-8d2b-f2cb5b369345-metrics-tls\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170088 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkcw\" (UniqueName: \"kubernetes.io/projected/3c0381fa-4154-4d46-8531-51dc924c58fc-kube-api-access-cfkcw\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170114 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" 
Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170153 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-stats-auth\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170170 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-registration-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170196 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-certs\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170214 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksr87\" (UniqueName: \"kubernetes.io/projected/cb37980a-f40a-4a5d-94c1-3197db1e7206-kube-api-access-ksr87\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170230 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: 
\"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170246 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/deda04fb-8c07-460f-9756-c6ee7572925f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170276 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-metrics-certs\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.170460 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:06.670439099 +0000 UTC m=+153.466838278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.170934 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77938bd8-bce7-4ca6-abcd-e0e965e47530-config\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.174431 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-service-ca-bundle\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176375 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176436 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176464 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb8m6\" (UniqueName: \"kubernetes.io/projected/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-kube-api-access-kb8m6\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176671 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b24e9005-dccf-4f8b-b476-19d9c5bf6544-config\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176701 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-mountpoint-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.176898 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178273 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-x79fq\" (UniqueName: \"kubernetes.io/projected/c3de352c-546d-48ce-bc08-d8731761b018-kube-api-access-x79fq\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178323 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b24e9005-dccf-4f8b-b476-19d9c5bf6544-serving-cert\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178365 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-webhook-cert\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178433 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178455 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc 
kubenswrapper[4700]: I1007 11:23:06.178460 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7759fe-2642-4798-95ad-09f9c3d0b901-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178512 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178808 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-profile-collector-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178827 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-plugins-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.178969 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.179730 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77938bd8-bce7-4ca6-abcd-e0e965e47530-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.179783 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jsq\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.180348 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.182084 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-metrics-tls\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.182122 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565pf\" (UniqueName: \"kubernetes.io/projected/b24e9005-dccf-4f8b-b476-19d9c5bf6544-kube-api-access-565pf\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.182180 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-config-volume\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.182457 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.182912 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 
11:23:06.182999 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.183052 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2dl\" (UniqueName: \"kubernetes.io/projected/8a7759fe-2642-4798-95ad-09f9c3d0b901-kube-api-access-hx2dl\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.183089 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56sw\" (UniqueName: \"kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.183222 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd245\" (UniqueName: \"kubernetes.io/projected/24978587-9dc8-4e03-aa11-7ac8b7cb716e-kube-api-access-wd245\") pod \"migrator-59844c95c7-ww2cw\" (UID: \"24978587-9dc8-4e03-aa11-7ac8b7cb716e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.183250 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpn4\" (UniqueName: 
\"kubernetes.io/projected/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-kube-api-access-nnpn4\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.183271 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-node-bootstrap-token\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.184149 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-key\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.184179 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.184202 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-csi-data-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.184694 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-metrics-certs\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.185891 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdkk2\" (UniqueName: \"kubernetes.io/projected/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-kube-api-access-tdkk2\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.185982 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.186105 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-cabundle\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.186133 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.186261 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlpx\" (UniqueName: \"kubernetes.io/projected/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-kube-api-access-ktlpx\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.186560 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3de352c-546d-48ce-bc08-d8731761b018-cert\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.186867 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.191095 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77938bd8-bce7-4ca6-abcd-e0e965e47530-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.192259 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca\") pod 
\"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.192647 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.196572 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.197852 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-stats-auth\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.203116 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-default-certificate\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.216494 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.234396 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77938bd8-bce7-4ca6-abcd-e0e965e47530-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrqbv\" (UID: \"77938bd8-bce7-4ca6-abcd-e0e965e47530\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.251917 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nc8d6"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.267588 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hbprv"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.277461 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.278201 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.288442 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.288709 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-cabundle\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.288791 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.291846 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-cabundle\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293230 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlpx\" (UniqueName: 
\"kubernetes.io/projected/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-kube-api-access-ktlpx\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293657 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3de352c-546d-48ce-bc08-d8731761b018-cert\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293823 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293886 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c271ef31-887c-4b30-857a-7969eb9063bf-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293915 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402cf0e4-8561-4c43-8d2b-f2cb5b369345-trusted-ca\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293935 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vqszd\" (UniqueName: \"kubernetes.io/projected/deda04fb-8c07-460f-9756-c6ee7572925f-kube-api-access-vqszd\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293958 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-srv-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.293987 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9249\" (UniqueName: \"kubernetes.io/projected/d4a12197-db67-481c-8508-a908ee9e9f01-kube-api-access-s9249\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294002 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-srv-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294022 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdc2s\" (UniqueName: \"kubernetes.io/projected/c271ef31-887c-4b30-857a-7969eb9063bf-kube-api-access-sdc2s\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294076 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-socket-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294097 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x29h\" (UniqueName: \"kubernetes.io/projected/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-kube-api-access-6x29h\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: \"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294116 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7759fe-2642-4798-95ad-09f9c3d0b901-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294136 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs2x\" (UniqueName: \"kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294156 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zfr\" (UniqueName: 
\"kubernetes.io/projected/bf93caea-b071-422d-ab31-9739c096160c-kube-api-access-c8zfr\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294179 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-tmpfs\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294215 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwzns\" (UniqueName: \"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-kube-api-access-bwzns\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294237 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkcw\" (UniqueName: \"kubernetes.io/projected/3c0381fa-4154-4d46-8531-51dc924c58fc-kube-api-access-cfkcw\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294257 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/402cf0e4-8561-4c43-8d2b-f2cb5b369345-metrics-tls\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294286 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294331 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksr87\" (UniqueName: \"kubernetes.io/projected/cb37980a-f40a-4a5d-94c1-3197db1e7206-kube-api-access-ksr87\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294354 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-registration-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294375 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-certs\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294392 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/deda04fb-8c07-460f-9756-c6ee7572925f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294422 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: \"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294448 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294474 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-apiservice-cert\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294491 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb8m6\" (UniqueName: \"kubernetes.io/projected/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-kube-api-access-kb8m6\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294522 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b24e9005-dccf-4f8b-b476-19d9c5bf6544-config\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294539 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-mountpoint-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294575 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79fq\" (UniqueName: \"kubernetes.io/projected/c3de352c-546d-48ce-bc08-d8731761b018-kube-api-access-x79fq\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294594 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-webhook-cert\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294612 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b24e9005-dccf-4f8b-b476-19d9c5bf6544-serving-cert\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294637 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7759fe-2642-4798-95ad-09f9c3d0b901-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294657 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294680 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-profile-collector-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294699 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-plugins-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294742 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-config-volume\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc 
kubenswrapper[4700]: I1007 11:23:06.294746 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-tmpfs\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294763 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-metrics-tls\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.294789 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565pf\" (UniqueName: \"kubernetes.io/projected/b24e9005-dccf-4f8b-b476-19d9c5bf6544-kube-api-access-565pf\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.295519 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-config-volume\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.295765 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-socket-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.296170 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2dl\" (UniqueName: \"kubernetes.io/projected/8a7759fe-2642-4798-95ad-09f9c3d0b901-kube-api-access-hx2dl\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.296648 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-registration-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.296232 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56sw\" (UniqueName: \"kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.297120 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:06.797103668 +0000 UTC m=+153.593502657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.300803 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7759fe-2642-4798-95ad-09f9c3d0b901-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301402 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd245\" (UniqueName: \"kubernetes.io/projected/24978587-9dc8-4e03-aa11-7ac8b7cb716e-kube-api-access-wd245\") pod \"migrator-59844c95c7-ww2cw\" (UID: \"24978587-9dc8-4e03-aa11-7ac8b7cb716e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301498 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpn4\" (UniqueName: \"kubernetes.io/projected/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-kube-api-access-nnpn4\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301573 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-node-bootstrap-token\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301653 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402cf0e4-8561-4c43-8d2b-f2cb5b369345-trusted-ca\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301798 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-key\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301880 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.301952 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-csi-data-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.302146 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-csi-data-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.302260 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdkk2\" (UniqueName: \"kubernetes.io/projected/8bc1e1cc-eb00-4dd9-82e5-895638bbde14-kube-api-access-tdkk2\") pod \"router-default-5444994796-26t2g\" (UID: \"8bc1e1cc-eb00-4dd9-82e5-895638bbde14\") " pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.302379 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-mountpoint-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.303035 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.303120 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0381fa-4154-4d46-8531-51dc924c58fc-plugins-dir\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.305430 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b24e9005-dccf-4f8b-b476-19d9c5bf6544-config\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.309379 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3de352c-546d-48ce-bc08-d8731761b018-cert\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.309627 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.315399 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.316096 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-srv-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.316920 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-srv-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.317231 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c271ef31-887c-4b30-857a-7969eb9063bf-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.317440 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jsq\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.318230 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4a12197-db67-481c-8508-a908ee9e9f01-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.320015 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" 
Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.327332 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-node-bootstrap-token\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.329090 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7759fe-2642-4798-95ad-09f9c3d0b901-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.330042 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/402cf0e4-8561-4c43-8d2b-f2cb5b369345-metrics-tls\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.331479 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cb37980a-f40a-4a5d-94c1-3197db1e7206-certs\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.332032 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: 
\"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.332562 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-metrics-tls\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.332699 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-webhook-cert\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.335355 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/deda04fb-8c07-460f-9756-c6ee7572925f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.335696 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b24e9005-dccf-4f8b-b476-19d9c5bf6544-serving-cert\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.336370 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zrjt7"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.336820 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-signing-key\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.338615 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-apiservice-cert\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.339430 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlpx\" (UniqueName: \"kubernetes.io/projected/bd3a2401-7fe7-4270-8437-8cdc5d4b4a65-kube-api-access-ktlpx\") pod \"dns-default-wflpr\" (UID: \"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65\") " pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.340218 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf93caea-b071-422d-ab31-9739c096160c-profile-collector-cert\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.356186 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdc2s\" (UniqueName: \"kubernetes.io/projected/c271ef31-887c-4b30-857a-7969eb9063bf-kube-api-access-sdc2s\") pod \"control-plane-machine-set-operator-78cbb6b69f-wcjjj\" (UID: \"c271ef31-887c-4b30-857a-7969eb9063bf\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 
crc kubenswrapper[4700]: I1007 11:23:06.374789 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.401644 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksr87\" (UniqueName: \"kubernetes.io/projected/cb37980a-f40a-4a5d-94c1-3197db1e7206-kube-api-access-ksr87\") pod \"machine-config-server-8j6qj\" (UID: \"cb37980a-f40a-4a5d-94c1-3197db1e7206\") " pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.402019 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.403676 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.403847 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:06.903825909 +0000 UTC m=+153.700224898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.403956 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.404341 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:06.904327592 +0000 UTC m=+153.700726581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.433335 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkcw\" (UniqueName: \"kubernetes.io/projected/3c0381fa-4154-4d46-8531-51dc924c58fc-kube-api-access-cfkcw\") pod \"csi-hostpathplugin-p6cjh\" (UID: \"3c0381fa-4154-4d46-8531-51dc924c58fc\") " pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.444061 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x29h\" (UniqueName: \"kubernetes.io/projected/296e1c83-6ec8-4e68-9052-a1d3ea0e1d29-kube-api-access-6x29h\") pod \"multus-admission-controller-857f4d67dd-wx8hg\" (UID: \"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.448184 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.460925 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9249\" (UniqueName: \"kubernetes.io/projected/d4a12197-db67-481c-8508-a908ee9e9f01-kube-api-access-s9249\") pod \"olm-operator-6b444d44fb-2nxnf\" (UID: \"d4a12197-db67-481c-8508-a908ee9e9f01\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.474337 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8j6qj" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.496395 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79fq\" (UniqueName: \"kubernetes.io/projected/c3de352c-546d-48ce-bc08-d8731761b018-kube-api-access-x79fq\") pod \"ingress-canary-v9nfw\" (UID: \"c3de352c-546d-48ce-bc08-d8731761b018\") " pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.500396 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565pf\" (UniqueName: \"kubernetes.io/projected/b24e9005-dccf-4f8b-b476-19d9c5bf6544-kube-api-access-565pf\") pod \"service-ca-operator-777779d784-9lqrf\" (UID: \"b24e9005-dccf-4f8b-b476-19d9c5bf6544\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.502984 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.505299 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.505715 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.005698609 +0000 UTC m=+153.802097598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.530392 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs2x\" (UniqueName: \"kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x\") pod \"marketplace-operator-79b997595-7qgxt\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.550086 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwzns\" (UniqueName: 
\"kubernetes.io/projected/402cf0e4-8561-4c43-8d2b-f2cb5b369345-kube-api-access-bwzns\") pod \"ingress-operator-5b745b69d9-rqvfm\" (UID: \"402cf0e4-8561-4c43-8d2b-f2cb5b369345\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.563180 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zfr\" (UniqueName: \"kubernetes.io/projected/bf93caea-b071-422d-ab31-9739c096160c-kube-api-access-c8zfr\") pod \"catalog-operator-68c6474976-l8kk7\" (UID: \"bf93caea-b071-422d-ab31-9739c096160c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.573486 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb37980a_f40a_4a5d_94c1_3197db1e7206.slice/crio-18a6580bbbffdab78f50509decb0571925797d89248e6f13f9b068a26ffebbbb WatchSource:0}: Error finding container 18a6580bbbffdab78f50509decb0571925797d89248e6f13f9b068a26ffebbbb: Status 404 returned error can't find the container with id 18a6580bbbffdab78f50509decb0571925797d89248e6f13f9b068a26ffebbbb Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.581499 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2dl\" (UniqueName: \"kubernetes.io/projected/8a7759fe-2642-4798-95ad-09f9c3d0b901-kube-api-access-hx2dl\") pod \"kube-storage-version-migrator-operator-b67b599dd-mcv46\" (UID: \"8a7759fe-2642-4798-95ad-09f9c3d0b901\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.585545 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.599796 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56sw\" (UniqueName: \"kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw\") pod \"collect-profiles-29330595-k68rm\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.607161 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.607870 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.107853526 +0000 UTC m=+153.904252515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.620475 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd245\" (UniqueName: \"kubernetes.io/projected/24978587-9dc8-4e03-aa11-7ac8b7cb716e-kube-api-access-wd245\") pod \"migrator-59844c95c7-ww2cw\" (UID: \"24978587-9dc8-4e03-aa11-7ac8b7cb716e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.633076 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.635479 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb8m6\" (UniqueName: \"kubernetes.io/projected/6e20de47-81b2-4b3c-9fbc-ab4ea511d276-kube-api-access-kb8m6\") pod \"packageserver-d55dfcdfc-gwc5q\" (UID: \"6e20de47-81b2-4b3c-9fbc-ab4ea511d276\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.643491 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.653525 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.666708 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.671542 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpn4\" (UniqueName: \"kubernetes.io/projected/d2f894dd-ca98-4d54-aa15-a4603f1e14c8-kube-api-access-nnpn4\") pod \"service-ca-9c57cc56f-d47lw\" (UID: \"d2f894dd-ca98-4d54-aa15-a4603f1e14c8\") " pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.682733 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.691915 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.707810 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.708426 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.708567 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqszd\" (UniqueName: \"kubernetes.io/projected/deda04fb-8c07-460f-9756-c6ee7572925f-kube-api-access-vqszd\") pod \"package-server-manager-789f6589d5-4f7h4\" (UID: \"deda04fb-8c07-460f-9756-c6ee7572925f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.708640 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.208622596 +0000 UTC m=+154.005021585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.715171 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.725995 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.727193 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.732515 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.733178 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.738267 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.738864 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.745911 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tpvgp"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.755457 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.764922 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9nfw" Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.801529 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.810533 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.810919 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.310906627 +0000 UTC m=+154.107305616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.829238 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2007404d_830b_44fe_b627_18b205d9f566.slice/crio-9bfbccf5af2c17f17ea521ffd5fabd18e7eff58d1dd7e096e4e3cf0981eb053c WatchSource:0}: Error finding container 9bfbccf5af2c17f17ea521ffd5fabd18e7eff58d1dd7e096e4e3cf0981eb053c: Status 404 returned error can't find the container with id 9bfbccf5af2c17f17ea521ffd5fabd18e7eff58d1dd7e096e4e3cf0981eb053c Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.829511 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351bb56c_499b_456c_81e2_ea2664ca5960.slice/crio-f2e884593d27a8285fd591a258423d0e14189d563a439bb824cb61d28fc4f893 WatchSource:0}: Error finding container f2e884593d27a8285fd591a258423d0e14189d563a439bb824cb61d28fc4f893: Status 404 returned error can't find the container with id f2e884593d27a8285fd591a258423d0e14189d563a439bb824cb61d28fc4f893 Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.833389 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.843076 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.844166 
4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.861204 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p6cjh"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.901789 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.906598 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wflpr"] Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.912162 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.912283 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.412252992 +0000 UTC m=+154.208651981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.915246 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:06 crc kubenswrapper[4700]: E1007 11:23:06.915899 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.41588156 +0000 UTC m=+154.212280549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.926059 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86dce66a_8c06_4f71_8ce9_dec87390310d.slice/crio-89e650332482887468c554060bf414b6bf6fdd8dab49cc7cc7c9e7b499b86aff WatchSource:0}: Error finding container 89e650332482887468c554060bf414b6bf6fdd8dab49cc7cc7c9e7b499b86aff: Status 404 returned error can't find the container with id 89e650332482887468c554060bf414b6bf6fdd8dab49cc7cc7c9e7b499b86aff Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.957998 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc271ef31_887c_4b30_857a_7969eb9063bf.slice/crio-cc049240da5f6a4f7f923890d011fd36b3465369dd26905105562f205c1b8560 WatchSource:0}: Error finding container cc049240da5f6a4f7f923890d011fd36b3465369dd26905105562f205c1b8560: Status 404 returned error can't find the container with id cc049240da5f6a4f7f923890d011fd36b3465369dd26905105562f205c1b8560 Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.961545 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" event={"ID":"4c92443d-f8f6-4941-9729-013d10138707","Type":"ContainerStarted","Data":"b2045fce5cb460684f9718d18727916cfb77a554d9533a5ab5dfc7fde8dd00cd"} Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.961587 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" event={"ID":"4c92443d-f8f6-4941-9729-013d10138707","Type":"ContainerStarted","Data":"bb38801631adad335f17c4dcd6c60d60825af7804c554264e7c4b1be312dc920"} Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.962462 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" event={"ID":"01f5674d-e0dc-4a62-94a8-7200c30f524d","Type":"ContainerStarted","Data":"69153710381e96300d4e7e23d95f23fc67f95e86ec2ce26eb17b21bbaf327cf4"} Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.980077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tpvgp" event={"ID":"351bb56c-499b-456c-81e2-ea2664ca5960","Type":"ContainerStarted","Data":"f2e884593d27a8285fd591a258423d0e14189d563a439bb824cb61d28fc4f893"} Oct 07 11:23:06 crc kubenswrapper[4700]: W1007 11:23:06.992519 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0381fa_4154_4d46_8531_51dc924c58fc.slice/crio-630c2a958196892f018e5e47c90822dc6e0b20823d2a935d60c15150b644d2c8 WatchSource:0}: Error finding container 630c2a958196892f018e5e47c90822dc6e0b20823d2a935d60c15150b644d2c8: Status 404 returned error can't find the container with id 630c2a958196892f018e5e47c90822dc6e0b20823d2a935d60c15150b644d2c8 Oct 07 11:23:06 crc kubenswrapper[4700]: I1007 11:23:06.997532 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8j6qj" event={"ID":"cb37980a-f40a-4a5d-94c1-3197db1e7206","Type":"ContainerStarted","Data":"18a6580bbbffdab78f50509decb0571925797d89248e6f13f9b068a26ffebbbb"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.004508 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" 
event={"ID":"8c79b619-73da-4918-b5fd-e45ac5463a91","Type":"ContainerStarted","Data":"19d3854673baf9e91c54fba91486b5cbb512a42b67ca3d88aeac1080be22dd0b"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.007434 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" event={"ID":"dce1cb95-805f-4793-9a3e-45f4b18ca4cc","Type":"ContainerStarted","Data":"adcc403c7aed96fac0b7c9219f3be5b10ace0155273c64b48736799ba8ef67ec"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.007471 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" event={"ID":"dce1cb95-805f-4793-9a3e-45f4b18ca4cc","Type":"ContainerStarted","Data":"68f6db70100b8a020ad910ae3f3e876c798d83921a3c7e25c2f93ed61ec4df23"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.017404 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" event={"ID":"fe97f639-3a7f-437e-945c-e5a530726ced","Type":"ContainerStarted","Data":"38ef7a4365174e5acdb74b4bff2a0203a39d020c337050aa439b21f8dc9d19ee"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.017911 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.018262 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.518246613 +0000 UTC m=+154.314645602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.019237 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" event={"ID":"adfb3989-3a4d-4796-8c41-d6f554acea6d","Type":"ContainerStarted","Data":"f016d4e5997fecfb9a8e2ebaf27c5c6e7a610bb1f78df27096ac3cf42398c0c2"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.020858 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" event={"ID":"86dce66a-8c06-4f71-8ce9-dec87390310d","Type":"ContainerStarted","Data":"89e650332482887468c554060bf414b6bf6fdd8dab49cc7cc7c9e7b499b86aff"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.022520 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" event={"ID":"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b","Type":"ContainerStarted","Data":"91b34c35b2000d875f3768251eee073b6e8a46046beb0dbc36ab09ee4306db9c"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.025065 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" event={"ID":"e78af36c-9fef-4db9-80a6-b0d02485f7bf","Type":"ContainerStarted","Data":"d19fed32e813b66dbd2c40a54bcad09b663d6c32aeca4b64470a95b2a8a553e8"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.025751 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.030187 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-26t2g" event={"ID":"8bc1e1cc-eb00-4dd9-82e5-895638bbde14","Type":"ContainerStarted","Data":"a47448331b3152abfc462b715cf3a41d2ec11f5ff78a9ed942742b0012ca330b"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.039790 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xx8r8" event={"ID":"96767da2-aab3-4f6d-a14b-ada1c0a4ded8","Type":"ContainerStarted","Data":"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.039832 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xx8r8" event={"ID":"96767da2-aab3-4f6d-a14b-ada1c0a4ded8","Type":"ContainerStarted","Data":"c6538b61de4ebf7d45dffdffbabd115c544a735d56563decaa9c4b00e96a73f5"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.048076 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" event={"ID":"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4","Type":"ContainerStarted","Data":"e5397ebbbec102fcdd39280baaf4a2ef2631116d325032606aafa8b1f18a65ea"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.048114 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" event={"ID":"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4","Type":"ContainerStarted","Data":"0fa4020f70bf82ee0c52e3367e5da70a8ba8d91eacbfc6069d240c110e73c3b7"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.048958 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" 
event={"ID":"2007404d-830b-44fe-b627-18b205d9f566","Type":"ContainerStarted","Data":"9bfbccf5af2c17f17ea521ffd5fabd18e7eff58d1dd7e096e4e3cf0981eb053c"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.056696 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" event={"ID":"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc","Type":"ContainerStarted","Data":"20195c9fcda490e977ce1d4a0623692f9ce5472e99beb9dbe1a3dabf93c0f746"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.060234 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" event={"ID":"0705c1f1-8602-47fc-9b52-82a70cf21976","Type":"ContainerStarted","Data":"3d7bd21c85de6a6f892c4e72ac62752cf85a83ddc3dd6e2218fbe05a7e6a0c54"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.060265 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" event={"ID":"0705c1f1-8602-47fc-9b52-82a70cf21976","Type":"ContainerStarted","Data":"b2c73163b5556c123c92990fa386806cb33b99bef72b81f9197a95de71268b5a"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.062495 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" event={"ID":"77938bd8-bce7-4ca6-abcd-e0e965e47530","Type":"ContainerStarted","Data":"a17f139e30bc21f866aa9fb87cfc166e05421780c7d0b39256e4646592850452"} Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.064599 4700 patch_prober.go:28] interesting pod/console-operator-58897d9998-7l929 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.064627 4700 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7l929" podUID="fe7d5c4e-f774-4519-8696-781c27efb691" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.065102 4700 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8l887 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.065124 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.119890 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.122014 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.621992694 +0000 UTC m=+154.418391683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.166013 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.192593 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.221039 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.221350 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.721317385 +0000 UTC m=+154.517716374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.221684 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.224920 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.724904022 +0000 UTC m=+154.521303011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.324214 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.325056 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.825026733 +0000 UTC m=+154.621425722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.325196 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.325820 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.825811224 +0000 UTC m=+154.622210213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.426003 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.426341 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.926317787 +0000 UTC m=+154.722716776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.426614 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.426948 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:07.926940874 +0000 UTC m=+154.723339863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.427596 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2xjmg" podStartSLOduration=129.427573011 podStartE2EDuration="2m9.427573011s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:07.420175631 +0000 UTC m=+154.216574620" watchObservedRunningTime="2025-10-07 11:23:07.427573011 +0000 UTC m=+154.223972000" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.465078 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqwpq" podStartSLOduration=129.465060173 podStartE2EDuration="2m9.465060173s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:07.461012874 +0000 UTC m=+154.257411863" watchObservedRunningTime="2025-10-07 11:23:07.465060173 +0000 UTC m=+154.261459162" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.529087 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.529394 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.029349738 +0000 UTC m=+154.825748727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.532243 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.533427 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.033386417 +0000 UTC m=+154.829785416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.544080 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wx8hg"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.619814 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.634546 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.635153 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.135137404 +0000 UTC m=+154.931536383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.743931 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.745133 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.245113572 +0000 UTC m=+155.041512571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.770976 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" podStartSLOduration=129.77095464 podStartE2EDuration="2m9.77095464s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:07.769568163 +0000 UTC m=+154.565967152" watchObservedRunningTime="2025-10-07 11:23:07.77095464 +0000 UTC m=+154.567353629" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.814941 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7l929" podStartSLOduration=129.814922917 podStartE2EDuration="2m9.814922917s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:07.803592291 +0000 UTC m=+154.599991280" watchObservedRunningTime="2025-10-07 11:23:07.814922917 +0000 UTC m=+154.611321906" Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.847685 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.847913 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.347882856 +0000 UTC m=+155.144281845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.847958 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.848611 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.348597996 +0000 UTC m=+155.144996975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.898204 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.913556 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4"] Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.949082 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:07 crc kubenswrapper[4700]: E1007 11:23:07.949462 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.449443378 +0000 UTC m=+155.245842357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:07 crc kubenswrapper[4700]: I1007 11:23:07.998925 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" podStartSLOduration=129.998904143 podStartE2EDuration="2m9.998904143s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:07.997783633 +0000 UTC m=+154.794182622" watchObservedRunningTime="2025-10-07 11:23:07.998904143 +0000 UTC m=+154.795303132" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.007128 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.038906 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.048908 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.050992 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: 
\"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.051360 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.551347639 +0000 UTC m=+155.347746628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.069079 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.069487 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qmpjz" podStartSLOduration=130.069474568 podStartE2EDuration="2m10.069474568s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.065552812 +0000 UTC m=+154.861951801" watchObservedRunningTime="2025-10-07 11:23:08.069474568 +0000 UTC m=+154.865873557" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.089216 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.092382 4700 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.104056 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4bv9h" podStartSLOduration=130.104036031 podStartE2EDuration="2m10.104036031s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.103540838 +0000 UTC m=+154.899939827" watchObservedRunningTime="2025-10-07 11:23:08.104036031 +0000 UTC m=+154.900435010" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.156667 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.157007 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.65698366 +0000 UTC m=+155.453382649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.157178 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.157738 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.65773066 +0000 UTC m=+155.454129639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.176984 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d47lw"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.189327 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tpvgp" event={"ID":"351bb56c-499b-456c-81e2-ea2664ca5960","Type":"ContainerStarted","Data":"c9a822c3c31d7e0ce0be363e2c02824d487bf886df028761cf730b1cbde9ae93"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.191353 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.213348 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.213401 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.248959 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" podStartSLOduration=130.248929742 podStartE2EDuration="2m10.248929742s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.215701595 +0000 UTC m=+155.012100574" watchObservedRunningTime="2025-10-07 11:23:08.248929742 +0000 UTC m=+155.045328741" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.256600 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9nfw"] Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.261358 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.262673 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.762649932 +0000 UTC m=+155.559048921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.296547 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" event={"ID":"2e3a06e3-520e-4096-ad0e-550a3cdd2a2b","Type":"ContainerStarted","Data":"fe199d9f795d5717024442dbcfb382b826bb56a6b74b55236063b0e31d7fe083"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.321983 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" event={"ID":"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29","Type":"ContainerStarted","Data":"01552baad5617f098de44a7c173e8c1c4214cf1c1fa8435230576c47349dc08a"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.363089 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.365383 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.864990785 +0000 UTC m=+155.661389774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.379667 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" event={"ID":"01f5674d-e0dc-4a62-94a8-7200c30f524d","Type":"ContainerStarted","Data":"c2377828e2caaf5cbc9b87d0b135cc99703501c48dd1695acda13f521bcb5180"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.402535 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" event={"ID":"deda04fb-8c07-460f-9756-c6ee7572925f","Type":"ContainerStarted","Data":"9a0608aefff02484b3946925e42cd6f6b66654f23ab435424e188a733eb50183"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.435048 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" event={"ID":"111689f4-eb0d-47d5-800c-0eaa9acd7425","Type":"ContainerStarted","Data":"22b30a591d639a0c8881abb67ac1d318aa3f771a1b0bf36faf84a013cb7b2b13"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.466683 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.468543 4700 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:08.968524639 +0000 UTC m=+155.764923628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.489398 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" podStartSLOduration=130.484826459 podStartE2EDuration="2m10.484826459s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.481730206 +0000 UTC m=+155.278129195" watchObservedRunningTime="2025-10-07 11:23:08.484826459 +0000 UTC m=+155.281225448" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.490285 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" event={"ID":"24978587-9dc8-4e03-aa11-7ac8b7cb716e","Type":"ContainerStarted","Data":"6c4fc1fce3d4be0d56aa1172c918ae5cb647435a44e17c51865b414199a42021"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.495238 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" 
event={"ID":"8a7759fe-2642-4798-95ad-09f9c3d0b901","Type":"ContainerStarted","Data":"a847e80f48696106829111b13d62b91c1ec76287653f8c061ff99df5a8bc6b78"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.508015 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" event={"ID":"1530c252-8f8d-4bf2-89f9-6d358d9cd617","Type":"ContainerStarted","Data":"c9e1478740e5e87e8ad970931eceef762d79a08d61a5b8a69c1e99c33e8b9719"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.508065 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" event={"ID":"1530c252-8f8d-4bf2-89f9-6d358d9cd617","Type":"ContainerStarted","Data":"5e5a00f836e724115386ce56ad4578801fe8e4e0d2ca2898f76f3657ef1f54aa"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.511742 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xx8r8" podStartSLOduration=130.511720825 podStartE2EDuration="2m10.511720825s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.506859354 +0000 UTC m=+155.303258343" watchObservedRunningTime="2025-10-07 11:23:08.511720825 +0000 UTC m=+155.308119814" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.513073 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-26t2g" event={"ID":"8bc1e1cc-eb00-4dd9-82e5-895638bbde14","Type":"ContainerStarted","Data":"3e66ab210c94fa7182d43ca5b05938527d29735fb1784f31c02882ad39031f28"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.546933 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" 
event={"ID":"c271ef31-887c-4b30-857a-7969eb9063bf","Type":"ContainerStarted","Data":"cc049240da5f6a4f7f923890d011fd36b3465369dd26905105562f205c1b8560"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.565853 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" event={"ID":"bf93caea-b071-422d-ab31-9739c096160c","Type":"ContainerStarted","Data":"2b1b6c68d83c58ed37b614327804d2d97ca272577312e0fb52fa2d1e32b50c03"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.570036 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.571582 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.07156434 +0000 UTC m=+155.867963329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.584731 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" event={"ID":"3c0381fa-4154-4d46-8531-51dc924c58fc","Type":"ContainerStarted","Data":"630c2a958196892f018e5e47c90822dc6e0b20823d2a935d60c15150b644d2c8"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.587976 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.612793 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8j6qj" event={"ID":"cb37980a-f40a-4a5d-94c1-3197db1e7206","Type":"ContainerStarted","Data":"31b58fd98f85176da1c04b6a45b74de95f9342c8cc9e13e24e1e1075687a64eb"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.616332 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:08 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:08 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:08 crc kubenswrapper[4700]: healthz check failed Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.616386 4700 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.673247 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.674717 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.174695544 +0000 UTC m=+155.971094533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.695668 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" event={"ID":"fe97f639-3a7f-437e-945c-e5a530726ced","Type":"ContainerStarted","Data":"065f6db23858dff6c693686998ef4171cb6553a86ec5262d5b7d7f8e46a0e703"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.742273 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" 
event={"ID":"402cf0e4-8561-4c43-8d2b-f2cb5b369345","Type":"ContainerStarted","Data":"1f9199abd0168ed35b7973bd68249767aa5a49a9afe9b2054bb1a64c46256d46"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.766926 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wflpr" event={"ID":"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65","Type":"ContainerStarted","Data":"16b71cc09d453eb181b46aa2dedea8058751d5949fbd835e13ae6d800e41cf19"} Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.783974 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.784448 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.284431876 +0000 UTC m=+156.080830875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.810199 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g2fw7" podStartSLOduration=130.810168871 podStartE2EDuration="2m10.810168871s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.800633903 +0000 UTC m=+155.597032902" watchObservedRunningTime="2025-10-07 11:23:08.810168871 +0000 UTC m=+155.606567870" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.884968 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:08 crc kubenswrapper[4700]: E1007 11:23:08.886727 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.386695196 +0000 UTC m=+156.183094185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.993895 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-26t2g" podStartSLOduration=130.993861809 podStartE2EDuration="2m10.993861809s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.857251732 +0000 UTC m=+155.653650721" watchObservedRunningTime="2025-10-07 11:23:08.993861809 +0000 UTC m=+155.790260798" Oct 07 11:23:08 crc kubenswrapper[4700]: I1007 11:23:08.994767 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8j6qj" podStartSLOduration=5.994756783 podStartE2EDuration="5.994756783s" podCreationTimestamp="2025-10-07 11:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:08.959547273 +0000 UTC m=+155.755946282" watchObservedRunningTime="2025-10-07 11:23:08.994756783 +0000 UTC m=+155.791155772" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.004759 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: 
\"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.020679 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.520637092 +0000 UTC m=+156.317036081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.055566 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" podStartSLOduration=131.055539924 podStartE2EDuration="2m11.055539924s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.040871748 +0000 UTC m=+155.837270737" watchObservedRunningTime="2025-10-07 11:23:09.055539924 +0000 UTC m=+155.851938913" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.094773 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.106059 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.106407 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.606389326 +0000 UTC m=+156.402788315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.213269 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.213873 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.713858157 +0000 UTC m=+156.510257146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.299117 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zrjt7" podStartSLOduration=131.299098788 podStartE2EDuration="2m11.299098788s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.295980214 +0000 UTC m=+156.092379203" watchObservedRunningTime="2025-10-07 11:23:09.299098788 +0000 UTC m=+156.095497777" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.315226 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.315824 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.815799159 +0000 UTC m=+156.612198148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.417870 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.418547 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:09.918533392 +0000 UTC m=+156.714932381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.461159 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tpvgp" podStartSLOduration=131.461136992 podStartE2EDuration="2m11.461136992s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.364917625 +0000 UTC m=+156.161316614" watchObservedRunningTime="2025-10-07 11:23:09.461136992 +0000 UTC m=+156.257535981" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.519461 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.519956 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.019933659 +0000 UTC m=+156.816332648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.591723 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:09 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:09 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:09 crc kubenswrapper[4700]: healthz check failed Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.591833 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.625538 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.626213 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 11:23:10.126195098 +0000 UTC m=+156.922594087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.700354 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.700730 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.714177 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.727729 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.728143 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.228126719 +0000 UTC m=+157.024525708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.806788 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" event={"ID":"fe97f639-3a7f-437e-945c-e5a530726ced","Type":"ContainerStarted","Data":"126d88b66b9a53caeb1e5b323ac90d41bd1736f5a7f41916b25556a6bd6e97e9"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.823168 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" event={"ID":"4c92443d-f8f6-4941-9729-013d10138707","Type":"ContainerStarted","Data":"6abcf4fff11fb991893b13e22f99f68d8613bc34e7ec80cc95344ce6156dfd11"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.829172 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.829583 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.329568317 +0000 UTC m=+157.125967306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.842899 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" event={"ID":"a40421f2-28c2-4eec-8a1b-77d96ac0b4fc","Type":"ContainerStarted","Data":"0da4bf0fd37f07c7185629f87305bb43ab2c8a562e8e5dd117f2b70c03e38f9f"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.851120 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9nfw" event={"ID":"c3de352c-546d-48ce-bc08-d8731761b018","Type":"ContainerStarted","Data":"ae5b7cd804baaff48c0f9564b90e998461356990900ec6cb0461cc9c3b4adc1b"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.852662 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" podStartSLOduration=131.85264121 podStartE2EDuration="2m11.85264121s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.851232982 +0000 UTC m=+156.647631971" watchObservedRunningTime="2025-10-07 11:23:09.85264121 +0000 UTC m=+156.649040199" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.860272 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" 
event={"ID":"deda04fb-8c07-460f-9756-c6ee7572925f","Type":"ContainerStarted","Data":"ac42bca47dbbd25f733c6f0aee66847422b3053443f6a9238650571189a76a0c"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.870766 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" event={"ID":"6e20de47-81b2-4b3c-9fbc-ab4ea511d276","Type":"ContainerStarted","Data":"3ac6b80734dcd79526a86e7a1513bdfa14db17e8ebcae6db8e2cd8d84d4e344a"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.872141 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.874218 4700 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gwc5q container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.874268 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" podUID="6e20de47-81b2-4b3c-9fbc-ab4ea511d276" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.883558 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" event={"ID":"d4a12197-db67-481c-8508-a908ee9e9f01","Type":"ContainerStarted","Data":"10eb06135d9dedac4607f613484a1799f7ff35e1641724fb5eecfe7854446f89"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.889408 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hbprv" 
podStartSLOduration=131.889389072 podStartE2EDuration="2m11.889389072s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.887316256 +0000 UTC m=+156.683715245" watchObservedRunningTime="2025-10-07 11:23:09.889389072 +0000 UTC m=+156.685788061" Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.912473 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" event={"ID":"adfb3989-3a4d-4796-8c41-d6f554acea6d","Type":"ContainerStarted","Data":"95e97591681efc3bfc7eaa5a7bc23b0a6604d4bd748418b3456c386a996f00ca"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.934171 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:09 crc kubenswrapper[4700]: E1007 11:23:09.935463 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.435444295 +0000 UTC m=+157.231843284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.938676 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" event={"ID":"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6","Type":"ContainerStarted","Data":"84a7aef1d5114e3eb188a1e8142594b182314e1013334015db53934af281e5a7"} Oct 07 11:23:09 crc kubenswrapper[4700]: I1007 11:23:09.968984 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9" podStartSLOduration=131.968964549 podStartE2EDuration="2m11.968964549s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.925778514 +0000 UTC m=+156.722177523" watchObservedRunningTime="2025-10-07 11:23:09.968964549 +0000 UTC m=+156.765363538" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.022083 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" podStartSLOduration=132.022060393 podStartE2EDuration="2m12.022060393s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:09.971296572 +0000 UTC m=+156.767695561" watchObservedRunningTime="2025-10-07 11:23:10.022060393 +0000 UTC m=+156.818459382" Oct 07 11:23:10 crc 
kubenswrapper[4700]: I1007 11:23:10.030669 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wkhrv" podStartSLOduration=132.030643644 podStartE2EDuration="2m12.030643644s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.021696693 +0000 UTC m=+156.818095682" watchObservedRunningTime="2025-10-07 11:23:10.030643644 +0000 UTC m=+156.827042633" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.037004 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.038404 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.538391133 +0000 UTC m=+157.334790122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.089568 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" podStartSLOduration=132.089544714 podStartE2EDuration="2m12.089544714s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.08790448 +0000 UTC m=+156.884303469" watchObservedRunningTime="2025-10-07 11:23:10.089544714 +0000 UTC m=+156.885943703" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.089993 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nc8d6" event={"ID":"3d72c6ea-d44c-4da5-9fa7-17cd4b3f3cd4","Type":"ContainerStarted","Data":"2adc5287ce25d4dac415ef2907be69788fa2bdf154cca3c1f04c951c75f087e8"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.137934 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wcjjj" event={"ID":"c271ef31-887c-4b30-857a-7969eb9063bf","Type":"ContainerStarted","Data":"e6104345af7b7646d76517b28484b9f518b54e8ec32baba948e4d96a9f0d66ff"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.138622 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.138969 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.638941688 +0000 UTC m=+157.435340677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.160209 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" event={"ID":"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29","Type":"ContainerStarted","Data":"61e34bec176667759df78a6c2baca309e40a64af5c6c5487b24e80705d2ee2b7"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.188687 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" event={"ID":"bf93caea-b071-422d-ab31-9739c096160c","Type":"ContainerStarted","Data":"0c7264395c3856a394b12c3fd8702c631b29c1334bdaeee5fcad85b9284ede44"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.190350 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.224535 4700 
patch_prober.go:28] interesting pod/catalog-operator-68c6474976-l8kk7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.224587 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" podUID="bf93caea-b071-422d-ab31-9739c096160c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.234583 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" event={"ID":"111689f4-eb0d-47d5-800c-0eaa9acd7425","Type":"ContainerStarted","Data":"ab1802fe0ac85a2a602b0d5a8a097157717bd725e758558e7ccb163df697b395"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.235051 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" event={"ID":"111689f4-eb0d-47d5-800c-0eaa9acd7425","Type":"ContainerStarted","Data":"4ebc5db1f11ccbb9d04b1732629fa44c3a20cfcdce35177c6e555467f4838241"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.244632 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.245890 4700 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.745876614 +0000 UTC m=+157.542275603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.258879 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" event={"ID":"2007404d-830b-44fe-b627-18b205d9f566","Type":"ContainerStarted","Data":"c251a28fb161f29abffe3730da6e2f6b619076874cd6b35edac80631840925e4"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.272826 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" event={"ID":"b24e9005-dccf-4f8b-b476-19d9c5bf6544","Type":"ContainerStarted","Data":"2cebf2a2f679f0a29daac9eb8c16236d79a1ad2c171e5b797f5712129f9bb402"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.272882 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" event={"ID":"b24e9005-dccf-4f8b-b476-19d9c5bf6544","Type":"ContainerStarted","Data":"f22d2c83d899b71edb729807b4a862f31934859f05300e569ba5730fc9ecb791"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.275882 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fvtx9" 
podStartSLOduration=132.275861364 podStartE2EDuration="2m12.275861364s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.269714118 +0000 UTC m=+157.066113097" watchObservedRunningTime="2025-10-07 11:23:10.275861364 +0000 UTC m=+157.072260353" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.276365 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" podStartSLOduration=132.276360667 podStartE2EDuration="2m12.276360667s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.244003034 +0000 UTC m=+157.040402033" watchObservedRunningTime="2025-10-07 11:23:10.276360667 +0000 UTC m=+157.072759656" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.288878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" event={"ID":"2a1405be-c33d-417f-839f-cb2f16ee0b70","Type":"ContainerStarted","Data":"6a38e158e26b451bd1b752cd65c399bcb27e6ab707fd16b60dcbf80a8d1ae450"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.317103 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttmh8" podStartSLOduration=132.317078066 podStartE2EDuration="2m12.317078066s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.314837376 +0000 UTC m=+157.111236365" watchObservedRunningTime="2025-10-07 11:23:10.317078066 +0000 UTC m=+157.113477055" Oct 07 11:23:10 crc 
kubenswrapper[4700]: I1007 11:23:10.322995 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" event={"ID":"77938bd8-bce7-4ca6-abcd-e0e965e47530","Type":"ContainerStarted","Data":"f527542168539b5ea909898a9ebdde1cd9d543483b86da9ee9058a97aec6c4de"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.350665 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" event={"ID":"86dce66a-8c06-4f71-8ce9-dec87390310d","Type":"ContainerStarted","Data":"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.351665 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.353515 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.354234 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.854218399 +0000 UTC m=+157.650617388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.375372 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" event={"ID":"8a7759fe-2642-4798-95ad-09f9c3d0b901","Type":"ContainerStarted","Data":"1dd292888fa394a096e54727ae9edc43813e2de8e916af79b207831d5380c4f1"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.390325 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" podStartSLOduration=132.390287782 podStartE2EDuration="2m12.390287782s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.361990418 +0000 UTC m=+157.158389407" watchObservedRunningTime="2025-10-07 11:23:10.390287782 +0000 UTC m=+157.186686771" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.392264 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" event={"ID":"1530c252-8f8d-4bf2-89f9-6d358d9cd617","Type":"ContainerStarted","Data":"040716e4197a157137bcc02d4e7a55e91dd95d068d31300880145d9f11d6427a"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.405846 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" 
event={"ID":"d2f894dd-ca98-4d54-aa15-a4603f1e14c8","Type":"ContainerStarted","Data":"b92ad2fe35813eb0c0b939d9abe5d39a888e0f3a7ca6665ba6cc931b510d79c4"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.415124 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lqrf" podStartSLOduration=131.415103732 podStartE2EDuration="2m11.415103732s" podCreationTimestamp="2025-10-07 11:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.393228312 +0000 UTC m=+157.189627301" watchObservedRunningTime="2025-10-07 11:23:10.415103732 +0000 UTC m=+157.211502721" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.415215 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" event={"ID":"402cf0e4-8561-4c43-8d2b-f2cb5b369345","Type":"ContainerStarted","Data":"497cff38d5632ffe71a6ca53f4f58bf6050fded0a23a5449b8d9489a837bea5c"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.417497 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wflpr" event={"ID":"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65","Type":"ContainerStarted","Data":"26949b0f5da334b2b51040f364e152b23b84d3dd1f78b5557d47e1f486756def"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.418061 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.443372 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" event={"ID":"24978587-9dc8-4e03-aa11-7ac8b7cb716e","Type":"ContainerStarted","Data":"365efa3e01af4dc353ae6862844189649a72516fc31b88538ad79df11e4d2fdc"} Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.445201 4700 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.445241 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.458683 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.460133 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-svs9t" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.463173 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:10.963156439 +0000 UTC m=+157.759555428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.491970 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lhj8c" podStartSLOduration=132.491945056 podStartE2EDuration="2m12.491945056s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.415458722 +0000 UTC m=+157.211857701" watchObservedRunningTime="2025-10-07 11:23:10.491945056 +0000 UTC m=+157.288344045" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.547259 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mcv46" podStartSLOduration=132.547234399 podStartE2EDuration="2m12.547234399s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.502223944 +0000 UTC m=+157.298622933" watchObservedRunningTime="2025-10-07 11:23:10.547234399 +0000 UTC m=+157.343633388" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.550253 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrqbv" podStartSLOduration=132.55023962 podStartE2EDuration="2m12.55023962s" 
podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.549332285 +0000 UTC m=+157.345731274" watchObservedRunningTime="2025-10-07 11:23:10.55023962 +0000 UTC m=+157.346638609" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.559648 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.559965 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.059947232 +0000 UTC m=+157.856346211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.569489 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b6rkf" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.606205 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:10 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:10 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:10 crc kubenswrapper[4700]: healthz check failed Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.606271 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.624722 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" podStartSLOduration=131.62470025 podStartE2EDuration="2m11.62470025s" podCreationTimestamp="2025-10-07 11:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.611296358 +0000 UTC 
m=+157.407695347" watchObservedRunningTime="2025-10-07 11:23:10.62470025 +0000 UTC m=+157.421099229" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.661839 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.662245 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.162231433 +0000 UTC m=+157.958630422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.685511 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" podStartSLOduration=131.68548772 podStartE2EDuration="2m11.68548772s" podCreationTimestamp="2025-10-07 11:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.682763787 +0000 UTC m=+157.479162776" watchObservedRunningTime="2025-10-07 11:23:10.68548772 +0000 UTC m=+157.481886710" Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 
11:23:10.763282 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.763455 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.263422434 +0000 UTC m=+158.059821423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.763620 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.764070 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 11:23:11.264052681 +0000 UTC m=+158.060451670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.865295 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.865474 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.365443258 +0000 UTC m=+158.161842247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.865965 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.866360 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.366348552 +0000 UTC m=+158.162747541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:10 crc kubenswrapper[4700]: I1007 11:23:10.967566 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:10 crc kubenswrapper[4700]: E1007 11:23:10.968082 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.468065637 +0000 UTC m=+158.264464626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.067534 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" podStartSLOduration=133.067509951 podStartE2EDuration="2m13.067509951s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:10.973189765 +0000 UTC m=+157.769588754" watchObservedRunningTime="2025-10-07 11:23:11.067509951 +0000 UTC m=+157.863908940" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.069668 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.070092 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.570077531 +0000 UTC m=+158.366476520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.095700 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" podStartSLOduration=133.095670301 podStartE2EDuration="2m13.095670301s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.067932113 +0000 UTC m=+157.864331102" watchObservedRunningTime="2025-10-07 11:23:11.095670301 +0000 UTC m=+157.892069280" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.101280 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wflpr" podStartSLOduration=8.101263002 podStartE2EDuration="8.101263002s" podCreationTimestamp="2025-10-07 11:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.098292352 +0000 UTC m=+157.894691341" watchObservedRunningTime="2025-10-07 11:23:11.101263002 +0000 UTC m=+157.897661991" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.170655 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.171836 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.671813237 +0000 UTC m=+158.468212226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.201994 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.273154 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.273508 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.773494301 +0000 UTC m=+158.569893290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.373904 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.373995 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.873980254 +0000 UTC m=+158.670379243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.374274 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.374628 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.874621461 +0000 UTC m=+158.671020450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.464103 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" event={"ID":"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6","Type":"ContainerStarted","Data":"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.464585 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.466085 4700 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7qgxt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.466122 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.469455 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" 
event={"ID":"2a1405be-c33d-417f-839f-cb2f16ee0b70","Type":"ContainerStarted","Data":"b0b165c7d26d1d296c43965c0f804e9e8afa33837599722f01c5286d61e1fa05"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.474343 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rqvfm" event={"ID":"402cf0e4-8561-4c43-8d2b-f2cb5b369345","Type":"ContainerStarted","Data":"92c1b237ca0690d57968c80472649d8af8daa48e72361c393895ade2a282cfb5"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.479975 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.480251 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:11.980229672 +0000 UTC m=+158.776628661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.489642 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wflpr" event={"ID":"bd3a2401-7fe7-4270-8437-8cdc5d4b4a65","Type":"ContainerStarted","Data":"990a59e34987e71b23b48b75d77ae885fc4d95fd7866561fab882a42809a2217"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.504274 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9nfw" event={"ID":"c3de352c-546d-48ce-bc08-d8731761b018","Type":"ContainerStarted","Data":"40984d7fffc2d4014c33033a211f4924a3ce2d88460b7601df828c147bb8bd79"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.517149 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" event={"ID":"deda04fb-8c07-460f-9756-c6ee7572925f","Type":"ContainerStarted","Data":"c2eb15e85314e56175c06f9cc0ef1da1d082c4fcb93da8feb3b0757ec4ac2cc3"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.517390 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.522144 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ww2cw" event={"ID":"24978587-9dc8-4e03-aa11-7ac8b7cb716e","Type":"ContainerStarted","Data":"1c68f6561fffb10415e9ae37b829f3b8ad598a1fac6648e7afaa5179a43ab0a9"} Oct 07 
11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.535509 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d47lw" event={"ID":"d2f894dd-ca98-4d54-aa15-a4603f1e14c8","Type":"ContainerStarted","Data":"51aa3f9112f07235d95265b1b76121b883dbffb67b67d1d366040087e246178e"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.552851 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" event={"ID":"296e1c83-6ec8-4e68-9052-a1d3ea0e1d29","Type":"ContainerStarted","Data":"7f5418f8119ff45f662cb73465f5397db479e41806f8ad3f9ff3b3e048fb1ff8"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.560667 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" podStartSLOduration=133.560646372 podStartE2EDuration="2m13.560646372s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.517007634 +0000 UTC m=+158.313406623" watchObservedRunningTime="2025-10-07 11:23:11.560646372 +0000 UTC m=+158.357045361" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.561507 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" event={"ID":"3c0381fa-4154-4d46-8531-51dc924c58fc","Type":"ContainerStarted","Data":"452cb0b62e3faea05cd84d8b4ca5469d95b17ae67f13a35bc299dc93b962e3da"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.561546 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" event={"ID":"3c0381fa-4154-4d46-8531-51dc924c58fc","Type":"ContainerStarted","Data":"03b56f79eb7f4d5a185e832a863fdb58e39c74cab84ee56c7a21fe3653073e3a"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.562507 4700 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v9nfw" podStartSLOduration=8.562502622 podStartE2EDuration="8.562502622s" podCreationTimestamp="2025-10-07 11:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.561191517 +0000 UTC m=+158.357590506" watchObservedRunningTime="2025-10-07 11:23:11.562502622 +0000 UTC m=+158.358901611" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.569584 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" event={"ID":"d4a12197-db67-481c-8508-a908ee9e9f01","Type":"ContainerStarted","Data":"967945c6cfa44a5d669c92b0c73125f662f8b3586a1ea77dd47e659c5dde89d4"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.569796 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.572166 4700 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2nxnf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.572211 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" podUID="d4a12197-db67-481c-8508-a908ee9e9f01" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.574453 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" 
event={"ID":"6e20de47-81b2-4b3c-9fbc-ab4ea511d276","Type":"ContainerStarted","Data":"78cdbab0d58fe9342e96a6456d7c273d514887e8cf244ff3f63a9a9cd1bfd8b4"} Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.578386 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.578425 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.578513 4700 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gwc5q container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.578584 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" podUID="6e20de47-81b2-4b3c-9fbc-ab4ea511d276" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.581361 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.582448 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.08243179 +0000 UTC m=+158.878830779 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.595599 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wx8hg" podStartSLOduration=133.595576925 podStartE2EDuration="2m13.595576925s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.595464422 +0000 UTC m=+158.391863411" watchObservedRunningTime="2025-10-07 11:23:11.595576925 +0000 UTC m=+158.391975904" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.604875 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:11 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:11 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:11 crc kubenswrapper[4700]: healthz check failed Oct 07 
11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.604924 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.643467 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l8kk7" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.682923 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.685464 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.185446391 +0000 UTC m=+158.981845380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.716249 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" podStartSLOduration=133.716225002 podStartE2EDuration="2m13.716225002s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.641135615 +0000 UTC m=+158.437534614" watchObservedRunningTime="2025-10-07 11:23:11.716225002 +0000 UTC m=+158.512623991" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.741778 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" podStartSLOduration=133.741756311 podStartE2EDuration="2m13.741756311s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:11.719156351 +0000 UTC m=+158.515555340" watchObservedRunningTime="2025-10-07 11:23:11.741756311 +0000 UTC m=+158.538155290" Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.789536 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" 
(UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.789926 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.289913621 +0000 UTC m=+159.086312610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.890876 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.891082 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.391048511 +0000 UTC m=+159.187447500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.891289 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.891623 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.391615386 +0000 UTC m=+159.188014375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.992039 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.992220 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.492191411 +0000 UTC m=+159.288590400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:11 crc kubenswrapper[4700]: I1007 11:23:11.992432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:11 crc kubenswrapper[4700]: E1007 11:23:11.992769 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.492757156 +0000 UTC m=+159.289156145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.093976 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.094171 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.594137253 +0000 UTC m=+159.390536242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.094507 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.094915 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.594900513 +0000 UTC m=+159.391299502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.195510 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.195746 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.695706204 +0000 UTC m=+159.492105193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.195997 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.215662 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.715643442 +0000 UTC m=+159.512042431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.297352 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.297679 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.797646406 +0000 UTC m=+159.594045395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.399180 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.399618 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:12.899597998 +0000 UTC m=+159.695996987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.500078 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.500335 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.000277085 +0000 UTC m=+159.796676074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.500640 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.500942 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.000929813 +0000 UTC m=+159.797328802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.559250 4700 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.582004 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" event={"ID":"3c0381fa-4154-4d46-8531-51dc924c58fc","Type":"ContainerStarted","Data":"a33033cdb9cfa2072bfa3d212a58810dda46b8d747d862a9c87b04c029c132e7"} Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.582067 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" event={"ID":"3c0381fa-4154-4d46-8531-51dc924c58fc","Type":"ContainerStarted","Data":"dbbb800a87b7769d4df215b066843c44a17a7234176fce8dfffd739e93fc3633"} Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.583124 4700 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7qgxt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.583179 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.588052 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:12 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:12 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:12 crc kubenswrapper[4700]: healthz check failed Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.588104 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.593847 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nxnf" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.602076 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.602277 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.102244978 +0000 UTC m=+159.898643967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.602532 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.602948 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.102935147 +0000 UTC m=+159.899334136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.628878 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-p6cjh" podStartSLOduration=9.628856696 podStartE2EDuration="9.628856696s" podCreationTimestamp="2025-10-07 11:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:12.614567961 +0000 UTC m=+159.410966950" watchObservedRunningTime="2025-10-07 11:23:12.628856696 +0000 UTC m=+159.425255685" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.703852 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.704089 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.204054706 +0000 UTC m=+160.000453705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.705266 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.705670 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.205650899 +0000 UTC m=+160.002049888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.806857 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.807070 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.307038896 +0000 UTC m=+160.103437885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.807169 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.807538 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.307531309 +0000 UTC m=+160.103930298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.859610 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gwc5q" Oct 07 11:23:12 crc kubenswrapper[4700]: I1007 11:23:12.907991 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:12 crc kubenswrapper[4700]: E1007 11:23:12.908491 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.408475784 +0000 UTC m=+160.204874773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.009735 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: E1007 11:23:13.010298 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.510276302 +0000 UTC m=+160.306675291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:13 crc kubenswrapper[4700]: E1007 11:23:13.111616 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.611594667 +0000 UTC m=+160.407993676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.111496 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.112006 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: E1007 11:23:13.112396 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 11:23:13.612375118 +0000 UTC m=+160.408774117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xshkj" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.139182 4700 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T11:23:12.559283188Z","Handler":null,"Name":""} Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.143005 4700 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.143050 4700 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.156883 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"] Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.158221 4700 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.160693 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.207769 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"] Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.213641 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.228656 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.315767 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.315883 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.315915 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.315940 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvktg\" (UniqueName: \"kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.319044 4700 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.319084 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.352556 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xshkj\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.353441 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9r2ww"] Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.354835 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.359578 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.374945 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9r2ww"] Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.402940 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.417027 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.417084 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.417118 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvktg\" (UniqueName: \"kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.417696 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.418131 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " 
pod="openshift-marketplace/certified-operators-8bwfw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.453501 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvktg\" (UniqueName: \"kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg\") pod \"certified-operators-8bwfw\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " pod="openshift-marketplace/certified-operators-8bwfw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.473296 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bwfw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.518626 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.519141 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.519170 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwhg\" (UniqueName: \"kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.555101 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dblxw"]
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.556668 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.566905 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dblxw"]
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.591496 4700 generic.go:334] "Generic (PLEG): container finished" podID="2a1405be-c33d-417f-839f-cb2f16ee0b70" containerID="b0b165c7d26d1d296c43965c0f804e9e8afa33837599722f01c5286d61e1fa05" exitCode=0
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.591731 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" event={"ID":"2a1405be-c33d-417f-839f-cb2f16ee0b70","Type":"ContainerDied","Data":"b0b165c7d26d1d296c43965c0f804e9e8afa33837599722f01c5286d61e1fa05"}
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.593608 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 11:23:13 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld
Oct 07 11:23:13 crc kubenswrapper[4700]: [+]process-running ok
Oct 07 11:23:13 crc kubenswrapper[4700]: healthz check failed
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.593664 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.593885 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.620834 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.620908 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.620947 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwhg\" (UniqueName: \"kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.621483 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.621615 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.641596 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwhg\" (UniqueName: \"kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg\") pod \"community-operators-9r2ww\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.670819 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r2ww"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.702977 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"]
Oct 07 11:23:13 crc kubenswrapper[4700]: W1007 11:23:13.714398 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8df7762_8c45_42ad_a645_83eb0bbed34a.slice/crio-479b3920f0016fb8980940d2c7706d0bda066f7f6836130ed3ea30a20334361e WatchSource:0}: Error finding container 479b3920f0016fb8980940d2c7706d0bda066f7f6836130ed3ea30a20334361e: Status 404 returned error can't find the container with id 479b3920f0016fb8980940d2c7706d0bda066f7f6836130ed3ea30a20334361e
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.722379 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.722459 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfb9\" (UniqueName: \"kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.722551 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.751994 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rrq9m"]
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.754413 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.768462 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrq9m"]
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.823676 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.824298 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfb9\" (UniqueName: \"kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.824407 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.825061 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.825391 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.844875 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfb9\" (UniqueName: \"kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9\") pod \"certified-operators-dblxw\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.884694 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dblxw"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.925718 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9r2ww"]
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.925933 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.925998 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.926044 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hw9\" (UniqueName: \"kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.973497 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 07 11:23:13 crc kubenswrapper[4700]: I1007 11:23:13.990377 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"]
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.027653 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.027720 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hw9\" (UniqueName: \"kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.027779 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.028243 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.028496 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.059053 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hw9\" (UniqueName: \"kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9\") pod \"community-operators-rrq9m\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.073782 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrq9m"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.143151 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dblxw"]
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.295675 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrq9m"]
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.389611 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.390978 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.407204 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.448931 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.450540 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.453205 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.453508 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.466227 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.538449 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.538553 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.589244 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 11:23:14 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld
Oct 07 11:23:14 crc kubenswrapper[4700]: [+]process-running ok
Oct 07 11:23:14 crc kubenswrapper[4700]: healthz check failed
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.589342 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.598652 4700 generic.go:334] "Generic (PLEG): container finished" podID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerID="3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a" exitCode=0
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.598756 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerDied","Data":"3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.598810 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerStarted","Data":"18851b8f0eb1a193f57fa0f01827f26e47cb2abcc7a6235964f07f8201787803"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.600200 4700 generic.go:334] "Generic (PLEG): container finished" podID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerID="cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749" exitCode=0
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.600246 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerDied","Data":"cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.600295 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerStarted","Data":"1794322d4f9f3dea65f1918292b990c23bb17212d04a636dd77f3dd63178ca55"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.601116 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.601744 4700 generic.go:334] "Generic (PLEG): container finished" podID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerID="33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf" exitCode=0
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.601833 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerDied","Data":"33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.601874 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerStarted","Data":"b2575d5bd169dc79d8b5e452ece37556c61750d094a446a7df7d6301d2390e4a"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.603419 4700 generic.go:334] "Generic (PLEG): container finished" podID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerID="a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a" exitCode=0
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.603497 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerDied","Data":"a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.603517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerStarted","Data":"b011b7bdb75dbcf24e770f7a962dd7f31d85e669b0de4b1c53e37c4f45926578"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.608927 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" event={"ID":"b8df7762-8c45-42ad-a645-83eb0bbed34a","Type":"ContainerStarted","Data":"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.608986 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" event={"ID":"b8df7762-8c45-42ad-a645-83eb0bbed34a","Type":"ContainerStarted","Data":"479b3920f0016fb8980940d2c7706d0bda066f7f6836130ed3ea30a20334361e"}
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.617202 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zhvq9"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.641095 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.641203 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.641865 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.693253 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.708761 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.761975 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7l929"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.840315 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" podStartSLOduration=136.840283648 podStartE2EDuration="2m16.840283648s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:14.837744679 +0000 UTC m=+161.634143668" watchObservedRunningTime="2025-10-07 11:23:14.840283648 +0000 UTC m=+161.636682637"
Oct 07 11:23:14 crc kubenswrapper[4700]: I1007 11:23:14.869642 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.051328 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.152679 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume\") pod \"2a1405be-c33d-417f-839f-cb2f16ee0b70\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") "
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.152762 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56sw\" (UniqueName: \"kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw\") pod \"2a1405be-c33d-417f-839f-cb2f16ee0b70\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") "
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.152813 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume\") pod \"2a1405be-c33d-417f-839f-cb2f16ee0b70\" (UID: \"2a1405be-c33d-417f-839f-cb2f16ee0b70\") "
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.162612 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a1405be-c33d-417f-839f-cb2f16ee0b70" (UID: "2a1405be-c33d-417f-839f-cb2f16ee0b70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.168702 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"]
Oct 07 11:23:15 crc kubenswrapper[4700]: E1007 11:23:15.169064 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1405be-c33d-417f-839f-cb2f16ee0b70" containerName="collect-profiles"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.169092 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1405be-c33d-417f-839f-cb2f16ee0b70" containerName="collect-profiles"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.169310 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1405be-c33d-417f-839f-cb2f16ee0b70" containerName="collect-profiles"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.171387 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.174462 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.177815 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a1405be-c33d-417f-839f-cb2f16ee0b70" (UID: "2a1405be-c33d-417f-839f-cb2f16ee0b70"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.177910 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw" (OuterVolumeSpecName: "kube-api-access-d56sw") pod "2a1405be-c33d-417f-839f-cb2f16ee0b70" (UID: "2a1405be-c33d-417f-839f-cb2f16ee0b70"). InnerVolumeSpecName "kube-api-access-d56sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.192399 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"]
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.209929 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.255463 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcksd\" (UniqueName: \"kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.255644 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.255838 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.256072 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a1405be-c33d-417f-839f-cb2f16ee0b70-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.256109 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d56sw\" (UniqueName: \"kubernetes.io/projected/2a1405be-c33d-417f-839f-cb2f16ee0b70-kube-api-access-d56sw\") on node \"crc\" DevicePath \"\""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.256122 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a1405be-c33d-417f-839f-cb2f16ee0b70-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.334239 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.334329 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.362047 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcksd\" (UniqueName: \"kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.362420 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.362524 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.363097 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.363133 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.387598 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcksd\" (UniqueName: \"kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd\") pod \"redhat-marketplace-c7vk6\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.501890 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7vk6"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.562348 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"]
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.564355 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zrq9"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.570176 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"]
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.590416 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 11:23:15 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld
Oct 07 11:23:15 crc kubenswrapper[4700]: [+]process-running ok
Oct 07 11:23:15 crc kubenswrapper[4700]: healthz check failed
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.590529 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.615868 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm" event={"ID":"2a1405be-c33d-417f-839f-cb2f16ee0b70","Type":"ContainerDied","Data":"6a38e158e26b451bd1b752cd65c399bcb27e6ab707fd16b60dcbf80a8d1ae450"}
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.615949 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a38e158e26b451bd1b752cd65c399bcb27e6ab707fd16b60dcbf80a8d1ae450"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.616065 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.633963 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e6dab5aa-6c06-4814-839d-10efee6cfb77","Type":"ContainerStarted","Data":"67cf732d8ec74b082ab2599ed63c98ee835f411ca1b2a7789296f7a1f845c1d8"}
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.634368 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.667053 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.667199 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9"
Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.667499 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcx5l\" (UniqueName: 
\"kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.719668 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.720063 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.722549 4700 patch_prober.go:28] interesting pod/console-f9d7485db-xx8r8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.722632 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xx8r8" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.762894 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"] Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.768685 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcx5l\" (UniqueName: \"kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.768755 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.768805 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.769324 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.769394 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.787673 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcx5l\" (UniqueName: \"kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l\") pod \"redhat-marketplace-4zrq9\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:15 crc kubenswrapper[4700]: I1007 11:23:15.883793 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.098289 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"] Oct 07 11:23:16 crc kubenswrapper[4700]: W1007 11:23:16.106018 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod630f464b_788f_48ee_93ee_d5644f705ec0.slice/crio-25aa8c486eb3d9eee69fa0cd0499599db8b0216ca5d2c40c5472d704e3bc1c85 WatchSource:0}: Error finding container 25aa8c486eb3d9eee69fa0cd0499599db8b0216ca5d2c40c5472d704e3bc1c85: Status 404 returned error can't find the container with id 25aa8c486eb3d9eee69fa0cd0499599db8b0216ca5d2c40c5472d704e3bc1c85 Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.184849 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.184905 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.184938 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.184962 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tpvgp" 
podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.552923 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.574016 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.577076 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.577777 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.585670 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.593193 4700 patch_prober.go:28] interesting pod/router-default-5444994796-26t2g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 11:23:16 crc kubenswrapper[4700]: [-]has-synced failed: reason withheld Oct 07 11:23:16 crc kubenswrapper[4700]: [+]process-running ok Oct 07 11:23:16 crc kubenswrapper[4700]: healthz check failed Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.593292 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-26t2g" podUID="8bc1e1cc-eb00-4dd9-82e5-895638bbde14" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.654982 4700 
generic.go:334] "Generic (PLEG): container finished" podID="630f464b-788f-48ee-93ee-d5644f705ec0" containerID="63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746" exitCode=0 Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.655044 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerDied","Data":"63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746"} Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.655105 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerStarted","Data":"25aa8c486eb3d9eee69fa0cd0499599db8b0216ca5d2c40c5472d704e3bc1c85"} Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.659419 4700 generic.go:334] "Generic (PLEG): container finished" podID="e6dab5aa-6c06-4814-839d-10efee6cfb77" containerID="529fdd6da5953eaab67fc2a13b24b7844002561260989fbac74675e017cbc0a0" exitCode=0 Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.659614 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e6dab5aa-6c06-4814-839d-10efee6cfb77","Type":"ContainerDied","Data":"529fdd6da5953eaab67fc2a13b24b7844002561260989fbac74675e017cbc0a0"} Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.663676 4700 generic.go:334] "Generic (PLEG): container finished" podID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerID="596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2" exitCode=0 Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.664890 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerDied","Data":"596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2"} Oct 07 
11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.664915 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerStarted","Data":"83ff4946b21aaafbc17f413e5460c9295321cb073057819e10b2b66318943d13"} Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.685116 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.685290 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2p2\" (UniqueName: \"kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.685399 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.792921 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.793079 
4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.793266 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2p2\" (UniqueName: \"kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.794958 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.796141 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.815039 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2p2\" (UniqueName: \"kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2\") pod \"redhat-operators-cr2l5\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.891038 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.953037 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.954172 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:16 crc kubenswrapper[4700]: I1007 11:23:16.960843 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.098461 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.098508 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.098567 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqzxl\" (UniqueName: \"kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.199613 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jqzxl\" (UniqueName: \"kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.199719 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.199746 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.200627 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.201162 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.217557 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqzxl\" (UniqueName: 
\"kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl\") pod \"redhat-operators-fg2dh\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.303627 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.360287 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.361038 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.370405 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.370675 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.371438 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.425181 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:23:17 crc kubenswrapper[4700]: W1007 11:23:17.438768 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17497e3a_9f81_4f68_8881_80a6aaae79a1.slice/crio-9e91a9c1e30bd8f7a17a51116484964bc7014c5aa17044750c8b88a811e93f93 WatchSource:0}: Error finding container 9e91a9c1e30bd8f7a17a51116484964bc7014c5aa17044750c8b88a811e93f93: Status 404 returned error can't find the container with id 
9e91a9c1e30bd8f7a17a51116484964bc7014c5aa17044750c8b88a811e93f93 Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.503491 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.503560 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.575704 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.590337 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.594844 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-26t2g" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.609014 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.609133 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.609472 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.631355 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.675368 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerStarted","Data":"9e91a9c1e30bd8f7a17a51116484964bc7014c5aa17044750c8b88a811e93f93"} Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.693574 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerStarted","Data":"04fc7beaf4e7d536ad489a2eeb4f81b75232418101965b8c609b543fa1371182"} Oct 07 11:23:17 crc kubenswrapper[4700]: I1007 11:23:17.710578 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.095093 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.173111 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.220807 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access\") pod \"e6dab5aa-6c06-4814-839d-10efee6cfb77\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.220950 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir\") pod \"e6dab5aa-6c06-4814-839d-10efee6cfb77\" (UID: \"e6dab5aa-6c06-4814-839d-10efee6cfb77\") " Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.221299 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e6dab5aa-6c06-4814-839d-10efee6cfb77" (UID: "e6dab5aa-6c06-4814-839d-10efee6cfb77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.240534 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e6dab5aa-6c06-4814-839d-10efee6cfb77" (UID: "e6dab5aa-6c06-4814-839d-10efee6cfb77"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.322643 4700 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6dab5aa-6c06-4814-839d-10efee6cfb77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.322677 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6dab5aa-6c06-4814-839d-10efee6cfb77-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.451233 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wflpr" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.705659 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9","Type":"ContainerStarted","Data":"de379f0298576a72812a99c0c6d034b5e49801e379d06bc128389e14ca82fe76"} Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.711416 4700 generic.go:334] "Generic (PLEG): container finished" podID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerID="6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3" exitCode=0 Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.711466 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerDied","Data":"6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3"} Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.714956 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e6dab5aa-6c06-4814-839d-10efee6cfb77","Type":"ContainerDied","Data":"67cf732d8ec74b082ab2599ed63c98ee835f411ca1b2a7789296f7a1f845c1d8"} Oct 07 
11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.715009 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67cf732d8ec74b082ab2599ed63c98ee835f411ca1b2a7789296f7a1f845c1d8" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.715083 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.729361 4700 generic.go:334] "Generic (PLEG): container finished" podID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerID="f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f" exitCode=0 Oct 07 11:23:18 crc kubenswrapper[4700]: I1007 11:23:18.729417 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerDied","Data":"f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f"} Oct 07 11:23:19 crc kubenswrapper[4700]: I1007 11:23:19.738652 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9","Type":"ContainerStarted","Data":"c108f6e542f7d89e2ef37f3b9e96ab31182fd37e3c1171504293b156fd0c2b6d"} Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.752706 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-z84rb_fe97f639-3a7f-437e-945c-e5a530726ced/cluster-samples-operator/0.log" Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.752753 4700 generic.go:334] "Generic (PLEG): container finished" podID="fe97f639-3a7f-437e-945c-e5a530726ced" containerID="065f6db23858dff6c693686998ef4171cb6553a86ec5262d5b7d7f8e46a0e703" exitCode=2 Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.752804 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" event={"ID":"fe97f639-3a7f-437e-945c-e5a530726ced","Type":"ContainerDied","Data":"065f6db23858dff6c693686998ef4171cb6553a86ec5262d5b7d7f8e46a0e703"} Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.753357 4700 scope.go:117] "RemoveContainer" containerID="065f6db23858dff6c693686998ef4171cb6553a86ec5262d5b7d7f8e46a0e703" Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.758929 4700 generic.go:334] "Generic (PLEG): container finished" podID="bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" containerID="c108f6e542f7d89e2ef37f3b9e96ab31182fd37e3c1171504293b156fd0c2b6d" exitCode=0 Oct 07 11:23:20 crc kubenswrapper[4700]: I1007 11:23:20.758984 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9","Type":"ContainerDied","Data":"c108f6e542f7d89e2ef37f3b9e96ab31182fd37e3c1171504293b156fd0c2b6d"} Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.148065 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.155912 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25429408-169d-4998-9b40-44a882f5a89e-metrics-certs\") pod \"network-metrics-daemon-dhsvm\" (UID: \"25429408-169d-4998-9b40-44a882f5a89e\") " pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.197966 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhsvm" Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.637656 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhsvm"] Oct 07 11:23:21 crc kubenswrapper[4700]: W1007 11:23:21.647893 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25429408_169d_4998_9b40_44a882f5a89e.slice/crio-f795a075c0f0b3323ecfab7f2e8aa8b08c2c8ab4733f18daab0b74b875fd862f WatchSource:0}: Error finding container f795a075c0f0b3323ecfab7f2e8aa8b08c2c8ab4733f18daab0b74b875fd862f: Status 404 returned error can't find the container with id f795a075c0f0b3323ecfab7f2e8aa8b08c2c8ab4733f18daab0b74b875fd862f Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.773432 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-z84rb_fe97f639-3a7f-437e-945c-e5a530726ced/cluster-samples-operator/0.log" Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.773563 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z84rb" event={"ID":"fe97f639-3a7f-437e-945c-e5a530726ced","Type":"ContainerStarted","Data":"b26c91f22e8f4579b645b02d5dc0798a874de8ff85f79b0131e96232ad0e0225"} Oct 07 11:23:21 crc kubenswrapper[4700]: I1007 11:23:21.775027 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" event={"ID":"25429408-169d-4998-9b40-44a882f5a89e","Type":"ContainerStarted","Data":"f795a075c0f0b3323ecfab7f2e8aa8b08c2c8ab4733f18daab0b74b875fd862f"} Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.058967 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.164843 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access\") pod \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.164959 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir\") pod \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\" (UID: \"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9\") " Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.165253 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" (UID: "bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.165593 4700 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.171922 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" (UID: "bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.266401 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.782159 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9","Type":"ContainerDied","Data":"de379f0298576a72812a99c0c6d034b5e49801e379d06bc128389e14ca82fe76"} Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.782224 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de379f0298576a72812a99c0c6d034b5e49801e379d06bc128389e14ca82fe76" Oct 07 11:23:22 crc kubenswrapper[4700]: I1007 11:23:22.783229 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 11:23:23 crc kubenswrapper[4700]: I1007 11:23:23.790456 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" event={"ID":"25429408-169d-4998-9b40-44a882f5a89e","Type":"ContainerStarted","Data":"0346fa053f47d1140e763b4053f5f0415945bf0c9e59bf259b1b6fe0d6b8221f"} Oct 07 11:23:25 crc kubenswrapper[4700]: I1007 11:23:25.725832 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:25 crc kubenswrapper[4700]: I1007 11:23:25.730065 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:23:26 crc kubenswrapper[4700]: I1007 11:23:26.181074 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:26 crc kubenswrapper[4700]: I1007 11:23:26.181088 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:26 crc kubenswrapper[4700]: I1007 11:23:26.181138 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:26 crc kubenswrapper[4700]: I1007 11:23:26.181157 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:32 crc kubenswrapper[4700]: I1007 11:23:32.204019 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 11:23:33 crc kubenswrapper[4700]: I1007 11:23:33.409874 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.181413 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182007 4700 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.181467 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182067 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182082 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182687 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182746 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182828 4700 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="download-server" containerStatusID={"Type":"cri-o","ID":"c9a822c3c31d7e0ce0be363e2c02824d487bf886df028761cf730b1cbde9ae93"} pod="openshift-console/downloads-7954f5f757-tpvgp" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.182922 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" containerID="cri-o://c9a822c3c31d7e0ce0be363e2c02824d487bf886df028761cf730b1cbde9ae93" gracePeriod=2 Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.883619 4700 generic.go:334] "Generic (PLEG): container finished" podID="351bb56c-499b-456c-81e2-ea2664ca5960" containerID="c9a822c3c31d7e0ce0be363e2c02824d487bf886df028761cf730b1cbde9ae93" exitCode=0 Oct 07 11:23:36 crc kubenswrapper[4700]: I1007 11:23:36.883675 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tpvgp" event={"ID":"351bb56c-499b-456c-81e2-ea2664ca5960","Type":"ContainerDied","Data":"c9a822c3c31d7e0ce0be363e2c02824d487bf886df028761cf730b1cbde9ae93"} Oct 07 11:23:45 crc kubenswrapper[4700]: I1007 11:23:45.334130 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:23:45 crc kubenswrapper[4700]: I1007 11:23:45.334982 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:23:46 crc 
kubenswrapper[4700]: I1007 11:23:46.183509 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:46 crc kubenswrapper[4700]: I1007 11:23:46.183615 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:46 crc kubenswrapper[4700]: I1007 11:23:46.720091 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4f7h4" Oct 07 11:23:53 crc kubenswrapper[4700]: E1007 11:23:53.499860 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 11:23:53 crc kubenswrapper[4700]: E1007 11:23:53.500619 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvktg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8bwfw_openshift-marketplace(a56f43f6-4fa8-47ab-b028-2bcc44e329d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:53 crc kubenswrapper[4700]: E1007 11:23:53.501857 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8bwfw" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" Oct 07 11:23:56 crc 
kubenswrapper[4700]: I1007 11:23:56.190645 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:56 crc kubenswrapper[4700]: I1007 11:23:56.191114 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.736893 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8bwfw" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.788725 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.788924 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqzxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fg2dh_openshift-marketplace(f084cb1b-50ca-41c8-8a54-0002371e9041): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.790741 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fg2dh" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" Oct 07 11:23:56 crc 
kubenswrapper[4700]: E1007 11:23:56.830227 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.830550 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrfb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-dblxw_openshift-marketplace(5ea7e71b-3585-463c-ac5a-ac47dfbb9f47): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:56 crc kubenswrapper[4700]: E1007 11:23:56.831754 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dblxw" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.065549 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dblxw" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.065665 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fg2dh" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.112042 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.112236 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tb2p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cr2l5_openshift-marketplace(17497e3a-9f81-4f68-8881-80a6aaae79a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.114754 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cr2l5" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.155387 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.155613 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z6hw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rrq9m_openshift-marketplace(dcfdc410-2928-44f4-a636-a683a6106aa8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.156881 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rrq9m" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.160669 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.160963 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlwhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9r2ww_openshift-marketplace(487ae31f-7fb6-4077-8f7d-11bb488b172b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.162336 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9r2ww" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" Oct 07 11:23:58 crc 
kubenswrapper[4700]: E1007 11:23:58.758858 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.759136 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fcksd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-c7vk6_openshift-marketplace(790ca1b9-1a1e-49d8-802b-e848efbc4c3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.761374 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c7vk6" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.766707 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.767044 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcx5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4zrq9_openshift-marketplace(630f464b-788f-48ee-93ee-d5644f705ec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:23:58 crc kubenswrapper[4700]: E1007 11:23:58.768823 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4zrq9" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" Oct 07 11:23:59 crc 
kubenswrapper[4700]: I1007 11:23:59.008714 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhsvm" event={"ID":"25429408-169d-4998-9b40-44a882f5a89e","Type":"ContainerStarted","Data":"59b0ec02a5e73795f4ec57c7a45ac27e844ae73a355daa7d6c3b65f13d3d7bd2"} Oct 07 11:23:59 crc kubenswrapper[4700]: I1007 11:23:59.014325 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tpvgp" event={"ID":"351bb56c-499b-456c-81e2-ea2664ca5960","Type":"ContainerStarted","Data":"4e659dcb1079e8116610454058ff9dfb88044f4f5150ae8c3e7ff9508a4d8fcd"} Oct 07 11:23:59 crc kubenswrapper[4700]: I1007 11:23:59.014368 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:23:59 crc kubenswrapper[4700]: I1007 11:23:59.015725 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:23:59 crc kubenswrapper[4700]: I1007 11:23:59.015754 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:23:59 crc kubenswrapper[4700]: E1007 11:23:59.017172 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c7vk6" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" Oct 07 11:23:59 crc kubenswrapper[4700]: E1007 11:23:59.017287 4700 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4zrq9" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" Oct 07 11:23:59 crc kubenswrapper[4700]: E1007 11:23:59.017337 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9r2ww" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" Oct 07 11:23:59 crc kubenswrapper[4700]: E1007 11:23:59.017700 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cr2l5" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" Oct 07 11:23:59 crc kubenswrapper[4700]: E1007 11:23:59.017908 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rrq9m" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" Oct 07 11:23:59 crc kubenswrapper[4700]: I1007 11:23:59.037814 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dhsvm" podStartSLOduration=181.037800467 podStartE2EDuration="3m1.037800467s" podCreationTimestamp="2025-10-07 11:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:23:59.035851865 +0000 UTC m=+205.832250894" 
watchObservedRunningTime="2025-10-07 11:23:59.037800467 +0000 UTC m=+205.834199456" Oct 07 11:24:00 crc kubenswrapper[4700]: I1007 11:24:00.021073 4700 patch_prober.go:28] interesting pod/downloads-7954f5f757-tpvgp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 07 11:24:00 crc kubenswrapper[4700]: I1007 11:24:00.021170 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tpvgp" podUID="351bb56c-499b-456c-81e2-ea2664ca5960" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 07 11:24:06 crc kubenswrapper[4700]: I1007 11:24:06.200516 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tpvgp" Oct 07 11:24:15 crc kubenswrapper[4700]: I1007 11:24:15.336108 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:24:15 crc kubenswrapper[4700]: I1007 11:24:15.336846 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:24:15 crc kubenswrapper[4700]: I1007 11:24:15.336918 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:24:15 crc kubenswrapper[4700]: I1007 11:24:15.337896 4700 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:24:15 crc kubenswrapper[4700]: I1007 11:24:15.337985 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6" gracePeriod=600 Oct 07 11:24:17 crc kubenswrapper[4700]: I1007 11:24:17.141913 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6" exitCode=0 Oct 07 11:24:17 crc kubenswrapper[4700]: I1007 11:24:17.142008 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6"} Oct 07 11:24:18 crc kubenswrapper[4700]: I1007 11:24:18.153666 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.171201 4700 generic.go:334] "Generic (PLEG): container finished" podID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerID="beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 
11:24:19.171393 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerDied","Data":"beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.180291 4700 generic.go:334] "Generic (PLEG): container finished" podID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerID="d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.180590 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerDied","Data":"d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.187006 4700 generic.go:334] "Generic (PLEG): container finished" podID="630f464b-788f-48ee-93ee-d5644f705ec0" containerID="bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.187083 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerDied","Data":"bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.193466 4700 generic.go:334] "Generic (PLEG): container finished" podID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerID="747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.193547 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" 
event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerDied","Data":"747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.200899 4700 generic.go:334] "Generic (PLEG): container finished" podID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerID="cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.200979 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerDied","Data":"cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.213487 4700 generic.go:334] "Generic (PLEG): container finished" podID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerID="3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.213988 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerDied","Data":"3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.223216 4700 generic.go:334] "Generic (PLEG): container finished" podID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerID="e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.223337 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerDied","Data":"e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f"} Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.230770 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerID="58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1" exitCode=0 Oct 07 11:24:19 crc kubenswrapper[4700]: I1007 11:24:19.230861 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerDied","Data":"58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.274503 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerStarted","Data":"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.283392 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerStarted","Data":"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.289404 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerStarted","Data":"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.293806 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerStarted","Data":"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.296728 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" 
event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerStarted","Data":"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.299882 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerStarted","Data":"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.301158 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8bwfw" podStartSLOduration=2.245059648 podStartE2EDuration="1m7.301149191s" podCreationTimestamp="2025-10-07 11:23:13 +0000 UTC" firstStartedPulling="2025-10-07 11:23:14.615041778 +0000 UTC m=+161.411440767" lastFinishedPulling="2025-10-07 11:24:19.671131311 +0000 UTC m=+226.467530310" observedRunningTime="2025-10-07 11:24:20.299594099 +0000 UTC m=+227.095993088" watchObservedRunningTime="2025-10-07 11:24:20.301149191 +0000 UTC m=+227.097548180" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.311553 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerStarted","Data":"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.313716 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerStarted","Data":"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083"} Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.342909 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fg2dh" podStartSLOduration=3.223319439 
podStartE2EDuration="1m4.342888499s" podCreationTimestamp="2025-10-07 11:23:16 +0000 UTC" firstStartedPulling="2025-10-07 11:23:18.734537333 +0000 UTC m=+165.530936322" lastFinishedPulling="2025-10-07 11:24:19.854106383 +0000 UTC m=+226.650505382" observedRunningTime="2025-10-07 11:24:20.33992866 +0000 UTC m=+227.136327649" watchObservedRunningTime="2025-10-07 11:24:20.342888499 +0000 UTC m=+227.139287488" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.362815 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rrq9m" podStartSLOduration=2.091346847 podStartE2EDuration="1m7.362796183s" podCreationTimestamp="2025-10-07 11:23:13 +0000 UTC" firstStartedPulling="2025-10-07 11:23:14.600811424 +0000 UTC m=+161.397210403" lastFinishedPulling="2025-10-07 11:24:19.87226075 +0000 UTC m=+226.668659739" observedRunningTime="2025-10-07 11:24:20.362177716 +0000 UTC m=+227.158576705" watchObservedRunningTime="2025-10-07 11:24:20.362796183 +0000 UTC m=+227.159195172" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.379985 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c7vk6" podStartSLOduration=2.092713862 podStartE2EDuration="1m5.379960463s" podCreationTimestamp="2025-10-07 11:23:15 +0000 UTC" firstStartedPulling="2025-10-07 11:23:16.665723331 +0000 UTC m=+163.462122320" lastFinishedPulling="2025-10-07 11:24:19.952969932 +0000 UTC m=+226.749368921" observedRunningTime="2025-10-07 11:24:20.378650848 +0000 UTC m=+227.175049847" watchObservedRunningTime="2025-10-07 11:24:20.379960463 +0000 UTC m=+227.176359442" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.394050 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cr2l5" podStartSLOduration=3.34588861 podStartE2EDuration="1m4.39403048s" podCreationTimestamp="2025-10-07 11:23:16 +0000 UTC" 
firstStartedPulling="2025-10-07 11:23:18.716541557 +0000 UTC m=+165.512940546" lastFinishedPulling="2025-10-07 11:24:19.764683407 +0000 UTC m=+226.561082416" observedRunningTime="2025-10-07 11:24:20.393838104 +0000 UTC m=+227.190237093" watchObservedRunningTime="2025-10-07 11:24:20.39403048 +0000 UTC m=+227.190429469" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.416497 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4zrq9" podStartSLOduration=2.326957066 podStartE2EDuration="1m5.416474941s" podCreationTimestamp="2025-10-07 11:23:15 +0000 UTC" firstStartedPulling="2025-10-07 11:23:16.657104338 +0000 UTC m=+163.453503327" lastFinishedPulling="2025-10-07 11:24:19.746622193 +0000 UTC m=+226.543021202" observedRunningTime="2025-10-07 11:24:20.415163046 +0000 UTC m=+227.211562035" watchObservedRunningTime="2025-10-07 11:24:20.416474941 +0000 UTC m=+227.212873920" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.433065 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dblxw" podStartSLOduration=2.448578388 podStartE2EDuration="1m7.433046235s" podCreationTimestamp="2025-10-07 11:23:13 +0000 UTC" firstStartedPulling="2025-10-07 11:23:14.601161543 +0000 UTC m=+161.397560532" lastFinishedPulling="2025-10-07 11:24:19.58562939 +0000 UTC m=+226.382028379" observedRunningTime="2025-10-07 11:24:20.431295788 +0000 UTC m=+227.227694767" watchObservedRunningTime="2025-10-07 11:24:20.433046235 +0000 UTC m=+227.229445224" Oct 07 11:24:20 crc kubenswrapper[4700]: I1007 11:24:20.448220 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9r2ww" podStartSLOduration=2.405839984 podStartE2EDuration="1m7.448199701s" podCreationTimestamp="2025-10-07 11:23:13 +0000 UTC" firstStartedPulling="2025-10-07 11:23:14.607391931 +0000 UTC m=+161.403790920" 
lastFinishedPulling="2025-10-07 11:24:19.649751628 +0000 UTC m=+226.446150637" observedRunningTime="2025-10-07 11:24:20.445409636 +0000 UTC m=+227.241808625" watchObservedRunningTime="2025-10-07 11:24:20.448199701 +0000 UTC m=+227.244598690" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.473958 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.474587 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.634026 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.671418 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.671492 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.713852 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.886089 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.886162 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:23 crc kubenswrapper[4700]: I1007 11:24:23.938296 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 
11:24:24.074781 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 11:24:24.075181 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 11:24:24.117283 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 11:24:24.388207 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 11:24:24.388364 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:24 crc kubenswrapper[4700]: I1007 11:24:24.397366 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.502255 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.502562 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.537473 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.607492 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dblxw"] Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.884577 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.884623 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:25 crc kubenswrapper[4700]: I1007 11:24:25.927610 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.344568 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dblxw" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="registry-server" containerID="cri-o://3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2" gracePeriod=2 Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.386068 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.395453 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.884281 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.891628 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.891675 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.942381 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content\") pod \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.951387 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:24:26 crc kubenswrapper[4700]: I1007 11:24:26.986771 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" (UID: "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.043868 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities\") pod \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.043964 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrfb9\" (UniqueName: \"kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9\") pod \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\" (UID: \"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47\") " Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.044252 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.044670 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities" (OuterVolumeSpecName: "utilities") pod "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" (UID: "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.049534 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9" (OuterVolumeSpecName: "kube-api-access-mrfb9") pod "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" (UID: "5ea7e71b-3585-463c-ac5a-ac47dfbb9f47"). InnerVolumeSpecName "kube-api-access-mrfb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.146397 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrfb9\" (UniqueName: \"kubernetes.io/projected/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-kube-api-access-mrfb9\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.146461 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.304281 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.304518 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.351671 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.352020 4700 generic.go:334] "Generic (PLEG): container finished" podID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerID="3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2" exitCode=0 Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.352103 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dblxw" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.352160 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerDied","Data":"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2"} Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.352238 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dblxw" event={"ID":"5ea7e71b-3585-463c-ac5a-ac47dfbb9f47","Type":"ContainerDied","Data":"1794322d4f9f3dea65f1918292b990c23bb17212d04a636dd77f3dd63178ca55"} Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.352266 4700 scope.go:117] "RemoveContainer" containerID="3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.391592 4700 scope.go:117] "RemoveContainer" containerID="747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.395478 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dblxw"] Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.403517 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.403575 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dblxw"] Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.406389 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.445348 4700 scope.go:117] "RemoveContainer" containerID="cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749" Oct 07 11:24:27 crc 
kubenswrapper[4700]: I1007 11:24:27.471545 4700 scope.go:117] "RemoveContainer" containerID="3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2" Oct 07 11:24:27 crc kubenswrapper[4700]: E1007 11:24:27.472099 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2\": container with ID starting with 3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2 not found: ID does not exist" containerID="3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.472160 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2"} err="failed to get container status \"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2\": rpc error: code = NotFound desc = could not find container \"3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2\": container with ID starting with 3bfd06af38ca9aa16839c58c193a00b5577f51c5468d8b303d4d6191231877d2 not found: ID does not exist" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.472192 4700 scope.go:117] "RemoveContainer" containerID="747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e" Oct 07 11:24:27 crc kubenswrapper[4700]: E1007 11:24:27.472582 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e\": container with ID starting with 747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e not found: ID does not exist" containerID="747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.472631 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e"} err="failed to get container status \"747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e\": rpc error: code = NotFound desc = could not find container \"747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e\": container with ID starting with 747607f441fd8e772ee3e56b8c51bf090e7932565512064a3edd5b9edf28525e not found: ID does not exist" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.472663 4700 scope.go:117] "RemoveContainer" containerID="cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749" Oct 07 11:24:27 crc kubenswrapper[4700]: E1007 11:24:27.473048 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749\": container with ID starting with cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749 not found: ID does not exist" containerID="cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.473084 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749"} err="failed to get container status \"cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749\": rpc error: code = NotFound desc = could not find container \"cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749\": container with ID starting with cfeb7083c6d6359b2f02abe79592e3e0d75853a887febbe08da045b2cb90b749 not found: ID does not exist" Oct 07 11:24:27 crc kubenswrapper[4700]: I1007 11:24:27.962979 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" path="/var/lib/kubelet/pods/5ea7e71b-3585-463c-ac5a-ac47dfbb9f47/volumes" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 
11:24:28.013592 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"] Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.358342 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4zrq9" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="registry-server" containerID="cri-o://9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083" gracePeriod=2 Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.708184 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.872574 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcx5l\" (UniqueName: \"kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l\") pod \"630f464b-788f-48ee-93ee-d5644f705ec0\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.873038 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content\") pod \"630f464b-788f-48ee-93ee-d5644f705ec0\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.873123 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities\") pod \"630f464b-788f-48ee-93ee-d5644f705ec0\" (UID: \"630f464b-788f-48ee-93ee-d5644f705ec0\") " Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.874581 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities" (OuterVolumeSpecName: 
"utilities") pod "630f464b-788f-48ee-93ee-d5644f705ec0" (UID: "630f464b-788f-48ee-93ee-d5644f705ec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.880705 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l" (OuterVolumeSpecName: "kube-api-access-hcx5l") pod "630f464b-788f-48ee-93ee-d5644f705ec0" (UID: "630f464b-788f-48ee-93ee-d5644f705ec0"). InnerVolumeSpecName "kube-api-access-hcx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.884765 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "630f464b-788f-48ee-93ee-d5644f705ec0" (UID: "630f464b-788f-48ee-93ee-d5644f705ec0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.974136 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcx5l\" (UniqueName: \"kubernetes.io/projected/630f464b-788f-48ee-93ee-d5644f705ec0-kube-api-access-hcx5l\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.974177 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:28 crc kubenswrapper[4700]: I1007 11:24:28.974192 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/630f464b-788f-48ee-93ee-d5644f705ec0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.366057 4700 generic.go:334] "Generic (PLEG): container finished" podID="630f464b-788f-48ee-93ee-d5644f705ec0" containerID="9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083" exitCode=0 Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.366161 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerDied","Data":"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083"} Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.366184 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zrq9" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.366221 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zrq9" event={"ID":"630f464b-788f-48ee-93ee-d5644f705ec0","Type":"ContainerDied","Data":"25aa8c486eb3d9eee69fa0cd0499599db8b0216ca5d2c40c5472d704e3bc1c85"} Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.366249 4700 scope.go:117] "RemoveContainer" containerID="9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.383728 4700 scope.go:117] "RemoveContainer" containerID="bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.403547 4700 scope.go:117] "RemoveContainer" containerID="63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.403809 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"] Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.407332 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zrq9"] Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.420053 4700 scope.go:117] "RemoveContainer" containerID="9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083" Oct 07 11:24:29 crc kubenswrapper[4700]: E1007 11:24:29.420565 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083\": container with ID starting with 9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083 not found: ID does not exist" containerID="9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.420685 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083"} err="failed to get container status \"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083\": rpc error: code = NotFound desc = could not find container \"9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083\": container with ID starting with 9870eaef3ccfea04e8da897b8bd9cd06dfcf4b06907a5ffff40be8a85544d083 not found: ID does not exist" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.420771 4700 scope.go:117] "RemoveContainer" containerID="bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70" Oct 07 11:24:29 crc kubenswrapper[4700]: E1007 11:24:29.421043 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70\": container with ID starting with bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70 not found: ID does not exist" containerID="bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.421091 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70"} err="failed to get container status \"bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70\": rpc error: code = NotFound desc = could not find container \"bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70\": container with ID starting with bfbbe8529836aa12f37057a8fb5a1eb36f956154a41efc00ed42f7944245bc70 not found: ID does not exist" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.421111 4700 scope.go:117] "RemoveContainer" containerID="63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746" Oct 07 11:24:29 crc kubenswrapper[4700]: E1007 
11:24:29.421366 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746\": container with ID starting with 63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746 not found: ID does not exist" containerID="63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.421389 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746"} err="failed to get container status \"63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746\": rpc error: code = NotFound desc = could not find container \"63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746\": container with ID starting with 63517a2aa24f14a0cf0b6d6c44a0b3639b6f51b113aa1601c7e01ebe3ac06746 not found: ID does not exist" Oct 07 11:24:29 crc kubenswrapper[4700]: I1007 11:24:29.964563 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" path="/var/lib/kubelet/pods/630f464b-788f-48ee-93ee-d5644f705ec0/volumes" Oct 07 11:24:30 crc kubenswrapper[4700]: I1007 11:24:30.406004 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:24:30 crc kubenswrapper[4700]: I1007 11:24:30.407301 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fg2dh" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerName="registry-server" containerID="cri-o://d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df" gracePeriod=2 Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.238383 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.380120 4700 generic.go:334] "Generic (PLEG): container finished" podID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerID="d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df" exitCode=0 Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.380201 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerDied","Data":"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df"} Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.380239 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg2dh" event={"ID":"f084cb1b-50ca-41c8-8a54-0002371e9041","Type":"ContainerDied","Data":"04fc7beaf4e7d536ad489a2eeb4f81b75232418101965b8c609b543fa1371182"} Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.380258 4700 scope.go:117] "RemoveContainer" containerID="d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.380379 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fg2dh" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.393958 4700 scope.go:117] "RemoveContainer" containerID="3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.407823 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities\") pod \"f084cb1b-50ca-41c8-8a54-0002371e9041\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.407891 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqzxl\" (UniqueName: \"kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl\") pod \"f084cb1b-50ca-41c8-8a54-0002371e9041\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.407929 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content\") pod \"f084cb1b-50ca-41c8-8a54-0002371e9041\" (UID: \"f084cb1b-50ca-41c8-8a54-0002371e9041\") " Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.408450 4700 scope.go:117] "RemoveContainer" containerID="f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.409259 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities" (OuterVolumeSpecName: "utilities") pod "f084cb1b-50ca-41c8-8a54-0002371e9041" (UID: "f084cb1b-50ca-41c8-8a54-0002371e9041"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.415479 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl" (OuterVolumeSpecName: "kube-api-access-jqzxl") pod "f084cb1b-50ca-41c8-8a54-0002371e9041" (UID: "f084cb1b-50ca-41c8-8a54-0002371e9041"). InnerVolumeSpecName "kube-api-access-jqzxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.444156 4700 scope.go:117] "RemoveContainer" containerID="d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df" Oct 07 11:24:31 crc kubenswrapper[4700]: E1007 11:24:31.444812 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df\": container with ID starting with d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df not found: ID does not exist" containerID="d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.444874 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df"} err="failed to get container status \"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df\": rpc error: code = NotFound desc = could not find container \"d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df\": container with ID starting with d75131c0d649ec6d079de9f513df16a7b7c5dd83dc4d20cf3fe2ece554bfc1df not found: ID does not exist" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.444911 4700 scope.go:117] "RemoveContainer" containerID="3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178" Oct 07 11:24:31 crc kubenswrapper[4700]: E1007 11:24:31.445547 
4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178\": container with ID starting with 3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178 not found: ID does not exist" containerID="3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.445579 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178"} err="failed to get container status \"3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178\": rpc error: code = NotFound desc = could not find container \"3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178\": container with ID starting with 3686e1bb7cd64c7981f808d6cabb6e20d27e2145f1136a5a1f509a767608e178 not found: ID does not exist" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.445615 4700 scope.go:117] "RemoveContainer" containerID="f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f" Oct 07 11:24:31 crc kubenswrapper[4700]: E1007 11:24:31.445994 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f\": container with ID starting with f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f not found: ID does not exist" containerID="f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.446061 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f"} err="failed to get container status \"f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f\": rpc error: code = 
NotFound desc = could not find container \"f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f\": container with ID starting with f3a5de9966f3a28ea1f95adfb14169278e3843bf468301705a8fe266f144580f not found: ID does not exist" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.501563 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f084cb1b-50ca-41c8-8a54-0002371e9041" (UID: "f084cb1b-50ca-41c8-8a54-0002371e9041"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.509139 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.509175 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqzxl\" (UniqueName: \"kubernetes.io/projected/f084cb1b-50ca-41c8-8a54-0002371e9041-kube-api-access-jqzxl\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.509188 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f084cb1b-50ca-41c8-8a54-0002371e9041-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.707940 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.710018 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fg2dh"] Oct 07 11:24:31 crc kubenswrapper[4700]: I1007 11:24:31.964011 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" path="/var/lib/kubelet/pods/f084cb1b-50ca-41c8-8a54-0002371e9041/volumes" Oct 07 11:24:34 crc kubenswrapper[4700]: I1007 11:24:34.122994 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:35 crc kubenswrapper[4700]: I1007 11:24:35.233686 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.005155 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rrq9m"] Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.006097 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rrq9m" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="registry-server" containerID="cri-o://0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6" gracePeriod=2 Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.378108 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.413610 4700 generic.go:334] "Generic (PLEG): container finished" podID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerID="0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6" exitCode=0 Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.413660 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerDied","Data":"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6"} Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.413691 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrq9m" event={"ID":"dcfdc410-2928-44f4-a636-a683a6106aa8","Type":"ContainerDied","Data":"18851b8f0eb1a193f57fa0f01827f26e47cb2abcc7a6235964f07f8201787803"} Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.413708 4700 scope.go:117] "RemoveContainer" containerID="0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.413699 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rrq9m" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.414105 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6hw9\" (UniqueName: \"kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9\") pod \"dcfdc410-2928-44f4-a636-a683a6106aa8\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.414139 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities\") pod \"dcfdc410-2928-44f4-a636-a683a6106aa8\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.414174 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content\") pod \"dcfdc410-2928-44f4-a636-a683a6106aa8\" (UID: \"dcfdc410-2928-44f4-a636-a683a6106aa8\") " Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.415272 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities" (OuterVolumeSpecName: "utilities") pod "dcfdc410-2928-44f4-a636-a683a6106aa8" (UID: "dcfdc410-2928-44f4-a636-a683a6106aa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.420567 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9" (OuterVolumeSpecName: "kube-api-access-z6hw9") pod "dcfdc410-2928-44f4-a636-a683a6106aa8" (UID: "dcfdc410-2928-44f4-a636-a683a6106aa8"). InnerVolumeSpecName "kube-api-access-z6hw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.457465 4700 scope.go:117] "RemoveContainer" containerID="e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.462228 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcfdc410-2928-44f4-a636-a683a6106aa8" (UID: "dcfdc410-2928-44f4-a636-a683a6106aa8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.493103 4700 scope.go:117] "RemoveContainer" containerID="3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.506004 4700 scope.go:117] "RemoveContainer" containerID="0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6" Oct 07 11:24:37 crc kubenswrapper[4700]: E1007 11:24:37.506709 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6\": container with ID starting with 0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6 not found: ID does not exist" containerID="0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.506775 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6"} err="failed to get container status \"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6\": rpc error: code = NotFound desc = could not find container \"0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6\": container with ID starting 
with 0fef2ef97feeb2cdb977430da7852544017bd4f3cc88dea1c8310c82036247c6 not found: ID does not exist" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.506822 4700 scope.go:117] "RemoveContainer" containerID="e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f" Oct 07 11:24:37 crc kubenswrapper[4700]: E1007 11:24:37.507175 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f\": container with ID starting with e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f not found: ID does not exist" containerID="e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.507217 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f"} err="failed to get container status \"e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f\": rpc error: code = NotFound desc = could not find container \"e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f\": container with ID starting with e29fae14e748ab147c2d50a6c7d91c233df80ce95a243d360887ec669173062f not found: ID does not exist" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.507248 4700 scope.go:117] "RemoveContainer" containerID="3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a" Oct 07 11:24:37 crc kubenswrapper[4700]: E1007 11:24:37.507570 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a\": container with ID starting with 3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a not found: ID does not exist" containerID="3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a" Oct 07 11:24:37 
crc kubenswrapper[4700]: I1007 11:24:37.507614 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a"} err="failed to get container status \"3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a\": rpc error: code = NotFound desc = could not find container \"3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a\": container with ID starting with 3a3be45a3c737fda62c500a4d975840f34fe2faefdc3d98227a7de6f72eb751a not found: ID does not exist" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.515058 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6hw9\" (UniqueName: \"kubernetes.io/projected/dcfdc410-2928-44f4-a636-a683a6106aa8-kube-api-access-z6hw9\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.515086 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.515099 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfdc410-2928-44f4-a636-a683a6106aa8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.756167 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rrq9m"] Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.758359 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rrq9m"] Oct 07 11:24:37 crc kubenswrapper[4700]: I1007 11:24:37.966949 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" path="/var/lib/kubelet/pods/dcfdc410-2928-44f4-a636-a683a6106aa8/volumes" Oct 07 11:25:00 
crc kubenswrapper[4700]: I1007 11:25:00.263224 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" containerName="oauth-openshift" containerID="cri-o://37784e25a0e2dc2e3c9b3254490435972bb12f0e74718e9e6d8269ded191cf20" gracePeriod=15 Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.597912 4700 generic.go:334] "Generic (PLEG): container finished" podID="274a19c3-7e08-4994-986f-d43d111bde3c" containerID="37784e25a0e2dc2e3c9b3254490435972bb12f0e74718e9e6d8269ded191cf20" exitCode=0 Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.598077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" event={"ID":"274a19c3-7e08-4994-986f-d43d111bde3c","Type":"ContainerDied","Data":"37784e25a0e2dc2e3c9b3254490435972bb12f0e74718e9e6d8269ded191cf20"} Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.673544 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695138 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695238 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695297 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695386 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695462 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wbdv\" (UniqueName: \"kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc 
kubenswrapper[4700]: I1007 11:25:00.695515 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695528 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695575 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695743 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695793 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 
11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695825 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695855 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.695883 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.696092 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.696439 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.696463 4700 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274a19c3-7e08-4994-986f-d43d111bde3c-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.698883 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.698909 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.707436 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.707998 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.708595 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.709083 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.711193 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.712043 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.713695 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.717176 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv" (OuterVolumeSpecName: "kube-api-access-9wbdv") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "kube-api-access-9wbdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721507 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-4g69k"] Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721771 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721794 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721808 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721817 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721833 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721843 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721855 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721866 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721889 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" 
containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721898 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721915 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" containerName="oauth-openshift" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721923 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" containerName="oauth-openshift" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721933 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721942 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721959 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721968 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.721983 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dab5aa-6c06-4814-839d-10efee6cfb77" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.721991 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dab5aa-6c06-4814-839d-10efee6cfb77" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722001 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" 
containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722011 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerName="extract-utilities" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722022 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722030 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722043 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722051 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722060 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722069 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722084 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722093 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="extract-content" Oct 07 11:25:00 crc kubenswrapper[4700]: E1007 11:25:00.722106 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" 
containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722114 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722239 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6f2f13-b68f-4ff7-a1b4-5603f32dcea9" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722252 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f084cb1b-50ca-41c8-8a54-0002371e9041" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722289 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea7e71b-3585-463c-ac5a-ac47dfbb9f47" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722340 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfdc410-2928-44f4-a636-a683a6106aa8" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722351 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="630f464b-788f-48ee-93ee-d5644f705ec0" containerName="registry-server" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722365 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6dab5aa-6c06-4814-839d-10efee6cfb77" containerName="pruner" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722377 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" containerName="oauth-openshift" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.722872 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.757877 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-4g69k"] Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796619 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796676 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig\") pod \"274a19c3-7e08-4994-986f-d43d111bde3c\" (UID: \"274a19c3-7e08-4994-986f-d43d111bde3c\") " Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796810 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796838 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 
11:25:00.796863 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796882 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796911 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796935 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb42879d-29f5-459e-9857-391c581d3828-audit-dir\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796949 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wfc\" (UniqueName: 
\"kubernetes.io/projected/fb42879d-29f5-459e-9857-391c581d3828-kube-api-access-l8wfc\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796970 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-audit-policies\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.796985 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797010 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797030 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: 
\"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797046 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797082 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797124 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797141 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797154 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wbdv\" (UniqueName: \"kubernetes.io/projected/274a19c3-7e08-4994-986f-d43d111bde3c-kube-api-access-9wbdv\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797167 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797180 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797196 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797208 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797223 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc 
kubenswrapper[4700]: I1007 11:25:00.797235 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797247 4700 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.797520 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.802581 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "274a19c3-7e08-4994-986f-d43d111bde3c" (UID: "274a19c3-7e08-4994-986f-d43d111bde3c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.897985 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898049 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898076 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb42879d-29f5-459e-9857-391c581d3828-audit-dir\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898101 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wfc\" (UniqueName: \"kubernetes.io/projected/fb42879d-29f5-459e-9857-391c581d3828-kube-api-access-l8wfc\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898139 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-audit-policies\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898163 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898198 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898233 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898258 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " 
pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898373 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb42879d-29f5-459e-9857-391c581d3828-audit-dir\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898669 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898715 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898763 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898794 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898833 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898879 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.898898 4700 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274a19c3-7e08-4994-986f-d43d111bde3c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.899900 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-audit-policies\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.900064 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-4g69k\" 
(UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.900198 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.901928 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.903139 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.903838 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.904626 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.904978 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.905733 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.905859 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.908527 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-4g69k\" 
(UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.909024 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb42879d-29f5-459e-9857-391c581d3828-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:00 crc kubenswrapper[4700]: I1007 11:25:00.921916 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wfc\" (UniqueName: \"kubernetes.io/projected/fb42879d-29f5-459e-9857-391c581d3828-kube-api-access-l8wfc\") pod \"oauth-openshift-6bf5fff678-4g69k\" (UID: \"fb42879d-29f5-459e-9857-391c581d3828\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.064380 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.295201 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-4g69k"] Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.608685 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" event={"ID":"274a19c3-7e08-4994-986f-d43d111bde3c","Type":"ContainerDied","Data":"ec2b30a4a9c5458750cfda38b4855805c77705295d8bbdbbf6f916eed128cf6c"} Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.609717 4700 scope.go:117] "RemoveContainer" containerID="37784e25a0e2dc2e3c9b3254490435972bb12f0e74718e9e6d8269ded191cf20" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.608711 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlgsp" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.615646 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" event={"ID":"fb42879d-29f5-459e-9857-391c581d3828","Type":"ContainerStarted","Data":"a9a500b72129f5f89397ad94f8f21b6f116d4ffc9ea5315fec6027a31a195c37"} Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.615699 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" event={"ID":"fb42879d-29f5-459e-9857-391c581d3828","Type":"ContainerStarted","Data":"dd38e9cf97185c5d7e6b6adb8c4dfd2252c78eb67b2e3915464698caf7860c18"} Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.616748 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.618961 4700 patch_prober.go:28] interesting pod/oauth-openshift-6bf5fff678-4g69k container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.54:6443/healthz\": dial tcp 10.217.0.54:6443: connect: connection refused" start-of-body= Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.619039 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" podUID="fb42879d-29f5-459e-9857-391c581d3828" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.54:6443/healthz\": dial tcp 10.217.0.54:6443: connect: connection refused" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.667530 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" podStartSLOduration=26.667500267 podStartE2EDuration="26.667500267s" podCreationTimestamp="2025-10-07 
11:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:25:01.646323041 +0000 UTC m=+268.442722080" watchObservedRunningTime="2025-10-07 11:25:01.667500267 +0000 UTC m=+268.463899266" Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.675596 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.679721 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlgsp"] Oct 07 11:25:01 crc kubenswrapper[4700]: I1007 11:25:01.965194 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274a19c3-7e08-4994-986f-d43d111bde3c" path="/var/lib/kubelet/pods/274a19c3-7e08-4994-986f-d43d111bde3c/volumes" Oct 07 11:25:02 crc kubenswrapper[4700]: I1007 11:25:02.632669 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bf5fff678-4g69k" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.154094 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.156012 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8bwfw" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="registry-server" containerID="cri-o://2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" gracePeriod=30 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.158066 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9r2ww"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.158398 4700 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-9r2ww" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="registry-server" containerID="cri-o://2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2" gracePeriod=30 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.171424 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.171719 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" containerID="cri-o://2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d" gracePeriod=30 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.184106 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.184454 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c7vk6" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="registry-server" containerID="cri-o://dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4" gracePeriod=30 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.189173 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.189538 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cr2l5" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="registry-server" containerID="cri-o://1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3" gracePeriod=30 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.198408 4700 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4vqf8"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.199573 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.220819 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4vqf8"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.328192 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vcc\" (UniqueName: \"kubernetes.io/projected/280bc2c9-6204-4779-b7c8-a09260dd2a66-kube-api-access-h9vcc\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.328243 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.328270 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.429346 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h9vcc\" (UniqueName: \"kubernetes.io/projected/280bc2c9-6204-4779-b7c8-a09260dd2a66-kube-api-access-h9vcc\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.429432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.429483 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.431615 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.443006 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280bc2c9-6204-4779-b7c8-a09260dd2a66-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.461254 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vcc\" (UniqueName: \"kubernetes.io/projected/280bc2c9-6204-4779-b7c8-a09260dd2a66-kube-api-access-h9vcc\") pod \"marketplace-operator-79b997595-4vqf8\" (UID: \"280bc2c9-6204-4779-b7c8-a09260dd2a66\") " pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.474953 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 is running failed: container process not found" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.475441 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 is running failed: container process not found" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.476002 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 is running failed: container process not found" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.476049 4700 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-8bwfw" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="registry-server" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.520487 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.581615 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.586794 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.605348 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.657464 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.660425 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.748635 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content\") pod \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.748764 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb2p2\" (UniqueName: \"kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2\") pod \"17497e3a-9f81-4f68-8881-80a6aaae79a1\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.748836 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities\") pod \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.748878 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcksd\" (UniqueName: \"kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd\") pod \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\" (UID: \"790ca1b9-1a1e-49d8-802b-e848efbc4c3e\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.748931 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content\") pod \"17497e3a-9f81-4f68-8881-80a6aaae79a1\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.749004 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content\") pod \"487ae31f-7fb6-4077-8f7d-11bb488b172b\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.749065 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities\") pod \"487ae31f-7fb6-4077-8f7d-11bb488b172b\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.749142 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlwhg\" (UniqueName: \"kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg\") pod \"487ae31f-7fb6-4077-8f7d-11bb488b172b\" (UID: \"487ae31f-7fb6-4077-8f7d-11bb488b172b\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.749191 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities\") pod \"17497e3a-9f81-4f68-8881-80a6aaae79a1\" (UID: \"17497e3a-9f81-4f68-8881-80a6aaae79a1\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.750160 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities" (OuterVolumeSpecName: "utilities") pod "790ca1b9-1a1e-49d8-802b-e848efbc4c3e" (UID: "790ca1b9-1a1e-49d8-802b-e848efbc4c3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.750168 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities" (OuterVolumeSpecName: "utilities") pod "487ae31f-7fb6-4077-8f7d-11bb488b172b" (UID: "487ae31f-7fb6-4077-8f7d-11bb488b172b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.751496 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities" (OuterVolumeSpecName: "utilities") pod "17497e3a-9f81-4f68-8881-80a6aaae79a1" (UID: "17497e3a-9f81-4f68-8881-80a6aaae79a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.768542 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "790ca1b9-1a1e-49d8-802b-e848efbc4c3e" (UID: "790ca1b9-1a1e-49d8-802b-e848efbc4c3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.794284 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4vqf8"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.806514 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd" (OuterVolumeSpecName: "kube-api-access-fcksd") pod "790ca1b9-1a1e-49d8-802b-e848efbc4c3e" (UID: "790ca1b9-1a1e-49d8-802b-e848efbc4c3e"). InnerVolumeSpecName "kube-api-access-fcksd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.806618 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg" (OuterVolumeSpecName: "kube-api-access-rlwhg") pod "487ae31f-7fb6-4077-8f7d-11bb488b172b" (UID: "487ae31f-7fb6-4077-8f7d-11bb488b172b"). InnerVolumeSpecName "kube-api-access-rlwhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.806774 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2" (OuterVolumeSpecName: "kube-api-access-tb2p2") pod "17497e3a-9f81-4f68-8881-80a6aaae79a1" (UID: "17497e3a-9f81-4f68-8881-80a6aaae79a1"). InnerVolumeSpecName "kube-api-access-tb2p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.810415 4700 generic.go:334] "Generic (PLEG): container finished" podID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerID="2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d" exitCode=0 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.810507 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.810503 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" event={"ID":"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6","Type":"ContainerDied","Data":"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.810559 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qgxt" event={"ID":"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6","Type":"ContainerDied","Data":"84a7aef1d5114e3eb188a1e8142594b182314e1013334015db53934af281e5a7"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.810584 4700 scope.go:117] "RemoveContainer" containerID="2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.824266 4700 generic.go:334] "Generic (PLEG): container finished" podID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerID="1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3" exitCode=0 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.824351 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerDied","Data":"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.824365 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cr2l5" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.824379 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr2l5" event={"ID":"17497e3a-9f81-4f68-8881-80a6aaae79a1","Type":"ContainerDied","Data":"9e91a9c1e30bd8f7a17a51116484964bc7014c5aa17044750c8b88a811e93f93"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.829630 4700 generic.go:334] "Generic (PLEG): container finished" podID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerID="dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4" exitCode=0 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.829693 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7vk6" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.829715 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerDied","Data":"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.829763 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7vk6" event={"ID":"790ca1b9-1a1e-49d8-802b-e848efbc4c3e","Type":"ContainerDied","Data":"83ff4946b21aaafbc17f413e5460c9295321cb073057819e10b2b66318943d13"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.833631 4700 generic.go:334] "Generic (PLEG): container finished" podID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerID="2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2" exitCode=0 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.833691 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" 
event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerDied","Data":"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.833951 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r2ww" event={"ID":"487ae31f-7fb6-4077-8f7d-11bb488b172b","Type":"ContainerDied","Data":"b2575d5bd169dc79d8b5e452ece37556c61750d094a446a7df7d6301d2390e4a"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.834012 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r2ww" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.838496 4700 generic.go:334] "Generic (PLEG): container finished" podID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" exitCode=0 Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.838541 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerDied","Data":"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.838563 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bwfw" event={"ID":"a56f43f6-4fa8-47ab-b028-2bcc44e329d0","Type":"ContainerDied","Data":"b011b7bdb75dbcf24e770f7a962dd7f31d85e669b0de4b1c53e37c4f45926578"} Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.838617 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bwfw" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.841589 4700 scope.go:117] "RemoveContainer" containerID="2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.841842 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "487ae31f-7fb6-4077-8f7d-11bb488b172b" (UID: "487ae31f-7fb6-4077-8f7d-11bb488b172b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.842143 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d\": container with ID starting with 2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d not found: ID does not exist" containerID="2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.842172 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d"} err="failed to get container status \"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d\": rpc error: code = NotFound desc = could not find container \"2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d\": container with ID starting with 2ce9b6f608a04db4a5b0f3d9609382869e9684c9023703cdd8e6d1108e0df27d not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.842191 4700 scope.go:117] "RemoveContainer" containerID="1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 
11:25:23.851230 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content\") pod \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851294 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca\") pod \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851352 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities\") pod \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851409 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqs2x\" (UniqueName: \"kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x\") pod \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851438 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvktg\" (UniqueName: \"kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg\") pod \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\" (UID: \"a56f43f6-4fa8-47ab-b028-2bcc44e329d0\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851455 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics\") pod \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\" (UID: \"ca3c8e5f-3994-409a-b8b2-58ee2ee245b6\") " Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851664 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlwhg\" (UniqueName: \"kubernetes.io/projected/487ae31f-7fb6-4077-8f7d-11bb488b172b-kube-api-access-rlwhg\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851675 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851685 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851694 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb2p2\" (UniqueName: \"kubernetes.io/projected/17497e3a-9f81-4f68-8881-80a6aaae79a1-kube-api-access-tb2p2\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851703 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851711 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcksd\" (UniqueName: \"kubernetes.io/projected/790ca1b9-1a1e-49d8-802b-e848efbc4c3e-kube-api-access-fcksd\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851719 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.851728 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487ae31f-7fb6-4077-8f7d-11bb488b172b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.852734 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities" (OuterVolumeSpecName: "utilities") pod "a56f43f6-4fa8-47ab-b028-2bcc44e329d0" (UID: "a56f43f6-4fa8-47ab-b028-2bcc44e329d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.853771 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" (UID: "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.856846 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x" (OuterVolumeSpecName: "kube-api-access-mqs2x") pod "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" (UID: "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6"). InnerVolumeSpecName "kube-api-access-mqs2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.857130 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" (UID: "ca3c8e5f-3994-409a-b8b2-58ee2ee245b6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.858558 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg" (OuterVolumeSpecName: "kube-api-access-zvktg") pod "a56f43f6-4fa8-47ab-b028-2bcc44e329d0" (UID: "a56f43f6-4fa8-47ab-b028-2bcc44e329d0"). InnerVolumeSpecName "kube-api-access-zvktg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.862017 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.865588 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7vk6"] Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.869564 4700 scope.go:117] "RemoveContainer" containerID="58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.871363 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17497e3a-9f81-4f68-8881-80a6aaae79a1" (UID: "17497e3a-9f81-4f68-8881-80a6aaae79a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.890160 4700 scope.go:117] "RemoveContainer" containerID="6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.906182 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a56f43f6-4fa8-47ab-b028-2bcc44e329d0" (UID: "a56f43f6-4fa8-47ab-b028-2bcc44e329d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.908923 4700 scope.go:117] "RemoveContainer" containerID="1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.909391 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3\": container with ID starting with 1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3 not found: ID does not exist" containerID="1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.909500 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3"} err="failed to get container status \"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3\": rpc error: code = NotFound desc = could not find container \"1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3\": container with ID starting with 1cdb0cef4b70ad8dba90741ebb2f73e5e2df2148980bf5928ef0b29c2315b4e3 not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.909583 4700 scope.go:117] 
"RemoveContainer" containerID="58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.910163 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1\": container with ID starting with 58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1 not found: ID does not exist" containerID="58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.910209 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1"} err="failed to get container status \"58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1\": rpc error: code = NotFound desc = could not find container \"58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1\": container with ID starting with 58e9b1307667b3558200c0e004363280719e8b200716cbe649b132da1f4e30a1 not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.910241 4700 scope.go:117] "RemoveContainer" containerID="6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.910784 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3\": container with ID starting with 6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3 not found: ID does not exist" containerID="6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.910894 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3"} err="failed to get container status \"6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3\": rpc error: code = NotFound desc = could not find container \"6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3\": container with ID starting with 6cf45c1a147b35322bddc4b8086229252d4e3b19bccd8949106ef4ed990e19a3 not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.910968 4700 scope.go:117] "RemoveContainer" containerID="dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.927353 4700 scope.go:117] "RemoveContainer" containerID="beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.942113 4700 scope.go:117] "RemoveContainer" containerID="596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953031 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17497e3a-9f81-4f68-8881-80a6aaae79a1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953069 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953081 4700 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953093 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953102 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqs2x\" (UniqueName: \"kubernetes.io/projected/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-kube-api-access-mqs2x\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953112 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvktg\" (UniqueName: \"kubernetes.io/projected/a56f43f6-4fa8-47ab-b028-2bcc44e329d0-kube-api-access-zvktg\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.953120 4700 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.956144 4700 scope.go:117] "RemoveContainer" containerID="dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.957653 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4\": container with ID starting with dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4 not found: ID does not exist" containerID="dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.957692 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4"} err="failed to get container status \"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4\": rpc error: code = NotFound desc = could not find 
container \"dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4\": container with ID starting with dba3207861d7cbea85470b7c066d7e13465c00a96d4deb80c92eaf2e5f92a5a4 not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.957726 4700 scope.go:117] "RemoveContainer" containerID="beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.958609 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c\": container with ID starting with beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c not found: ID does not exist" containerID="beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.958657 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c"} err="failed to get container status \"beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c\": rpc error: code = NotFound desc = could not find container \"beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c\": container with ID starting with beb4c08326a724ef416b73c2f409f1ef1d353dc961a9c0a3e0ff9884a20b903c not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.958687 4700 scope.go:117] "RemoveContainer" containerID="596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2" Oct 07 11:25:23 crc kubenswrapper[4700]: E1007 11:25:23.959250 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2\": container with ID starting with 596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2 not found: ID does 
not exist" containerID="596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.959376 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2"} err="failed to get container status \"596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2\": rpc error: code = NotFound desc = could not find container \"596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2\": container with ID starting with 596c919ebf6b875b2799a9a5109011d8f075136d269c7384aa7597f02754c6e2 not found: ID does not exist" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.959470 4700 scope.go:117] "RemoveContainer" containerID="2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.967418 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" path="/var/lib/kubelet/pods/790ca1b9-1a1e-49d8-802b-e848efbc4c3e/volumes" Oct 07 11:25:23 crc kubenswrapper[4700]: I1007 11:25:23.980112 4700 scope.go:117] "RemoveContainer" containerID="d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.009561 4700 scope.go:117] "RemoveContainer" containerID="33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.037133 4700 scope.go:117] "RemoveContainer" containerID="2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2" Oct 07 11:25:24 crc kubenswrapper[4700]: E1007 11:25:24.037923 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2\": container with ID starting with 
2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2 not found: ID does not exist" containerID="2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.037959 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2"} err="failed to get container status \"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2\": rpc error: code = NotFound desc = could not find container \"2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2\": container with ID starting with 2560fc8caa644a41d36efff79ae79ea49ca45a4dd7f3ac90b2fa4b40ff4eefe2 not found: ID does not exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.037985 4700 scope.go:117] "RemoveContainer" containerID="d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d" Oct 07 11:25:24 crc kubenswrapper[4700]: E1007 11:25:24.038352 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d\": container with ID starting with d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d not found: ID does not exist" containerID="d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.038379 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d"} err="failed to get container status \"d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d\": rpc error: code = NotFound desc = could not find container \"d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d\": container with ID starting with d48430ee2c175830564f0031f926693aaef3d42c788c76061e0747e60b4ee58d not found: ID does not 
exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.038396 4700 scope.go:117] "RemoveContainer" containerID="33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf" Oct 07 11:25:24 crc kubenswrapper[4700]: E1007 11:25:24.038825 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf\": container with ID starting with 33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf not found: ID does not exist" containerID="33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.038892 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf"} err="failed to get container status \"33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf\": rpc error: code = NotFound desc = could not find container \"33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf\": container with ID starting with 33dacfefec07737c4b2e99f06b29c1a9285b2959fbcd51e7de6567c493de6bcf not found: ID does not exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.038975 4700 scope.go:117] "RemoveContainer" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.060579 4700 scope.go:117] "RemoveContainer" containerID="cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.080697 4700 scope.go:117] "RemoveContainer" containerID="a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.098559 4700 scope.go:117] "RemoveContainer" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" Oct 07 11:25:24 crc 
kubenswrapper[4700]: E1007 11:25:24.099096 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69\": container with ID starting with 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 not found: ID does not exist" containerID="2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.099127 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69"} err="failed to get container status \"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69\": rpc error: code = NotFound desc = could not find container \"2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69\": container with ID starting with 2dc4f1ea40e85f8880385647609d3d4337144af4536669fe652a004e60091f69 not found: ID does not exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.099152 4700 scope.go:117] "RemoveContainer" containerID="cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78" Oct 07 11:25:24 crc kubenswrapper[4700]: E1007 11:25:24.099469 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78\": container with ID starting with cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78 not found: ID does not exist" containerID="cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.099511 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78"} err="failed to get container status 
\"cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78\": rpc error: code = NotFound desc = could not find container \"cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78\": container with ID starting with cd5e927a233fc46878a2302a8d40e3f8781094e78fa9c286bf589901a784aa78 not found: ID does not exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.099533 4700 scope.go:117] "RemoveContainer" containerID="a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a" Oct 07 11:25:24 crc kubenswrapper[4700]: E1007 11:25:24.099803 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a\": container with ID starting with a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a not found: ID does not exist" containerID="a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.099827 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a"} err="failed to get container status \"a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a\": rpc error: code = NotFound desc = could not find container \"a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a\": container with ID starting with a271f2831e9b301668c8c4d812b50aaf39614e29bf1fa4fdf0f772b29525828a not found: ID does not exist" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.126594 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.131003 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qgxt"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.143637 4700 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.147622 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cr2l5"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.159074 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9r2ww"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.163802 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9r2ww"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.168337 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.175762 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8bwfw"] Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.851423 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" event={"ID":"280bc2c9-6204-4779-b7c8-a09260dd2a66","Type":"ContainerStarted","Data":"c081834229ee6a48cf6397cf2f909fdaeeb44fd8dade9882da0712c8f18ab5f6"} Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.851472 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" event={"ID":"280bc2c9-6204-4779-b7c8-a09260dd2a66","Type":"ContainerStarted","Data":"60616ae142b3edf3b682bc69eb13ea14fea93a90dda0382b8d383a8da1e815ef"} Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.851625 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.856751 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" Oct 07 11:25:24 crc kubenswrapper[4700]: I1007 11:25:24.874972 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4vqf8" podStartSLOduration=1.874937533 podStartE2EDuration="1.874937533s" podCreationTimestamp="2025-10-07 11:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:25:24.869974647 +0000 UTC m=+291.666373626" watchObservedRunningTime="2025-10-07 11:25:24.874937533 +0000 UTC m=+291.671336552" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366093 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fvvh8"] Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366858 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366877 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366892 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366900 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366912 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366921 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" 
containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366931 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366939 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366950 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366957 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366969 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366977 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.366991 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.366999 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367009 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367017 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367030 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367038 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367048 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367055 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367066 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367073 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="extract-content" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367080 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367088 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: E1007 11:25:25.367099 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367105 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="extract-utilities" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367191 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" containerName="marketplace-operator" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367199 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="790ca1b9-1a1e-49d8-802b-e848efbc4c3e" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367206 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367219 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.367230 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" containerName="registry-server" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.368118 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.370183 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.370354 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-utilities\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.370377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swhj9\" (UniqueName: \"kubernetes.io/projected/32b97e41-e040-497c-8feb-312e9b11364a-kube-api-access-swhj9\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.370409 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-catalog-content\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.381517 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvvh8"] Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.472035 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-utilities\") pod \"redhat-marketplace-fvvh8\" (UID: 
\"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.472089 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swhj9\" (UniqueName: \"kubernetes.io/projected/32b97e41-e040-497c-8feb-312e9b11364a-kube-api-access-swhj9\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.472159 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-catalog-content\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.472660 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-utilities\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.472736 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b97e41-e040-497c-8feb-312e9b11364a-catalog-content\") pod \"redhat-marketplace-fvvh8\" (UID: \"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.490842 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swhj9\" (UniqueName: \"kubernetes.io/projected/32b97e41-e040-497c-8feb-312e9b11364a-kube-api-access-swhj9\") pod \"redhat-marketplace-fvvh8\" (UID: 
\"32b97e41-e040-497c-8feb-312e9b11364a\") " pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.574203 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.579834 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.585545 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.594845 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.691186 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.775428 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.775466 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.775542 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nm2kv\" (UniqueName: \"kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.876801 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvvh8"] Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.877575 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2kv\" (UniqueName: \"kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.877877 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.877909 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.878816 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 
11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.878896 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.898947 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2kv\" (UniqueName: \"kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv\") pod \"redhat-operators-pw4v8\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.915891 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.965230 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17497e3a-9f81-4f68-8881-80a6aaae79a1" path="/var/lib/kubelet/pods/17497e3a-9f81-4f68-8881-80a6aaae79a1/volumes" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.966037 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487ae31f-7fb6-4077-8f7d-11bb488b172b" path="/var/lib/kubelet/pods/487ae31f-7fb6-4077-8f7d-11bb488b172b/volumes" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.966809 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56f43f6-4fa8-47ab-b028-2bcc44e329d0" path="/var/lib/kubelet/pods/a56f43f6-4fa8-47ab-b028-2bcc44e329d0/volumes" Oct 07 11:25:25 crc kubenswrapper[4700]: I1007 11:25:25.968580 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3c8e5f-3994-409a-b8b2-58ee2ee245b6" path="/var/lib/kubelet/pods/ca3c8e5f-3994-409a-b8b2-58ee2ee245b6/volumes" Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 
11:25:26.098346 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 11:25:26 crc kubenswrapper[4700]: W1007 11:25:26.152107 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df6a670_4c88_45f9_a160_f35e4b7b0b64.slice/crio-4cdcf7d612460fefc6a1e419b4965fb53b128ede494e4662d460a8e0999d270b WatchSource:0}: Error finding container 4cdcf7d612460fefc6a1e419b4965fb53b128ede494e4662d460a8e0999d270b: Status 404 returned error can't find the container with id 4cdcf7d612460fefc6a1e419b4965fb53b128ede494e4662d460a8e0999d270b Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.862704 4700 generic.go:334] "Generic (PLEG): container finished" podID="3df6a670-4c88-45f9-a160-f35e4b7b0b64" containerID="c4edfb0577f9e6d2181d6d19f412578a858887d8c4107bb9e36b88cb914ea500" exitCode=0 Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.862785 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerDied","Data":"c4edfb0577f9e6d2181d6d19f412578a858887d8c4107bb9e36b88cb914ea500"} Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.863198 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerStarted","Data":"4cdcf7d612460fefc6a1e419b4965fb53b128ede494e4662d460a8e0999d270b"} Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.865179 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvvh8" event={"ID":"32b97e41-e040-497c-8feb-312e9b11364a","Type":"ContainerDied","Data":"72a6ac6f581217a88371b53b1a6a31a0ec9eb9c2cb70259787cd6590d3f2a12a"} Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.865146 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="32b97e41-e040-497c-8feb-312e9b11364a" containerID="72a6ac6f581217a88371b53b1a6a31a0ec9eb9c2cb70259787cd6590d3f2a12a" exitCode=0 Oct 07 11:25:26 crc kubenswrapper[4700]: I1007 11:25:26.865273 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvvh8" event={"ID":"32b97e41-e040-497c-8feb-312e9b11364a","Type":"ContainerStarted","Data":"8bb94342da24d584ebf79ee97436ccb7012e1ad87b7aff6f76a003dccd9efc91"} Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.769937 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngg7w"] Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.771958 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.776684 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.781754 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngg7w"] Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.802908 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-utilities\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.803029 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-catalog-content\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " 
pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.803097 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fvb\" (UniqueName: \"kubernetes.io/projected/152601b0-4148-4077-bf15-899a1ee66ce7-kube-api-access-85fvb\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.907117 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-utilities\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.907222 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-catalog-content\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.907673 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-utilities\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.907261 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fvb\" (UniqueName: \"kubernetes.io/projected/152601b0-4148-4077-bf15-899a1ee66ce7-kube-api-access-85fvb\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " 
pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.907679 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152601b0-4148-4077-bf15-899a1ee66ce7-catalog-content\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.932749 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fvb\" (UniqueName: \"kubernetes.io/projected/152601b0-4148-4077-bf15-899a1ee66ce7-kube-api-access-85fvb\") pod \"certified-operators-ngg7w\" (UID: \"152601b0-4148-4077-bf15-899a1ee66ce7\") " pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.989559 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k5jbs"] Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.992490 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:27 crc kubenswrapper[4700]: I1007 11:25:27.997034 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.000953 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5jbs"] Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.008932 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-catalog-content\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.008979 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-utilities\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.009077 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvd4\" (UniqueName: \"kubernetes.io/projected/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-kube-api-access-zlvd4\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.107536 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.111746 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-catalog-content\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.111821 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-utilities\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.111982 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvd4\" (UniqueName: \"kubernetes.io/projected/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-kube-api-access-zlvd4\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.113259 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-catalog-content\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.113785 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-utilities\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " 
pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.140968 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvd4\" (UniqueName: \"kubernetes.io/projected/4a424dee-5265-41d8-8ef7-fe5d2bcec50c-kube-api-access-zlvd4\") pod \"community-operators-k5jbs\" (UID: \"4a424dee-5265-41d8-8ef7-fe5d2bcec50c\") " pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.313840 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.517625 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5jbs"] Oct 07 11:25:28 crc kubenswrapper[4700]: W1007 11:25:28.527738 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a424dee_5265_41d8_8ef7_fe5d2bcec50c.slice/crio-2a95e7ac2d8151c36e7dcfc6020cf37e5f2ac752264770422ca23856afc52047 WatchSource:0}: Error finding container 2a95e7ac2d8151c36e7dcfc6020cf37e5f2ac752264770422ca23856afc52047: Status 404 returned error can't find the container with id 2a95e7ac2d8151c36e7dcfc6020cf37e5f2ac752264770422ca23856afc52047 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.600460 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngg7w"] Oct 07 11:25:28 crc kubenswrapper[4700]: W1007 11:25:28.610217 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod152601b0_4148_4077_bf15_899a1ee66ce7.slice/crio-7a52f64819c826575b60c627271655759d5a2f9299cc5d3ed287decfad8468b4 WatchSource:0}: Error finding container 7a52f64819c826575b60c627271655759d5a2f9299cc5d3ed287decfad8468b4: Status 404 returned error can't find the container 
with id 7a52f64819c826575b60c627271655759d5a2f9299cc5d3ed287decfad8468b4 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.878549 4700 generic.go:334] "Generic (PLEG): container finished" podID="152601b0-4148-4077-bf15-899a1ee66ce7" containerID="5fa0a5cc50e0389bf8381cd7930c90b691568ef7be151da041db9312a7320ccf" exitCode=0 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.878641 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngg7w" event={"ID":"152601b0-4148-4077-bf15-899a1ee66ce7","Type":"ContainerDied","Data":"5fa0a5cc50e0389bf8381cd7930c90b691568ef7be151da041db9312a7320ccf"} Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.878676 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngg7w" event={"ID":"152601b0-4148-4077-bf15-899a1ee66ce7","Type":"ContainerStarted","Data":"7a52f64819c826575b60c627271655759d5a2f9299cc5d3ed287decfad8468b4"} Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.881093 4700 generic.go:334] "Generic (PLEG): container finished" podID="32b97e41-e040-497c-8feb-312e9b11364a" containerID="17a2cef0c5531df8cfe834de78d291b6d48bca78cd779fe5587837d517bbe5ac" exitCode=0 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.881287 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvvh8" event={"ID":"32b97e41-e040-497c-8feb-312e9b11364a","Type":"ContainerDied","Data":"17a2cef0c5531df8cfe834de78d291b6d48bca78cd779fe5587837d517bbe5ac"} Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.901360 4700 generic.go:334] "Generic (PLEG): container finished" podID="3df6a670-4c88-45f9-a160-f35e4b7b0b64" containerID="341b827ddafe0be27747205e54b7266b6c86bd30d4148c0eeab6d5a7c4976321" exitCode=0 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.901434 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" 
event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerDied","Data":"341b827ddafe0be27747205e54b7266b6c86bd30d4148c0eeab6d5a7c4976321"} Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.904984 4700 generic.go:334] "Generic (PLEG): container finished" podID="4a424dee-5265-41d8-8ef7-fe5d2bcec50c" containerID="378378161b35686c50add770238afb76d89e98f498af1d86f7831894daa8a517" exitCode=0 Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.905046 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5jbs" event={"ID":"4a424dee-5265-41d8-8ef7-fe5d2bcec50c","Type":"ContainerDied","Data":"378378161b35686c50add770238afb76d89e98f498af1d86f7831894daa8a517"} Oct 07 11:25:28 crc kubenswrapper[4700]: I1007 11:25:28.905092 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5jbs" event={"ID":"4a424dee-5265-41d8-8ef7-fe5d2bcec50c","Type":"ContainerStarted","Data":"2a95e7ac2d8151c36e7dcfc6020cf37e5f2ac752264770422ca23856afc52047"} Oct 07 11:25:29 crc kubenswrapper[4700]: I1007 11:25:29.912327 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvvh8" event={"ID":"32b97e41-e040-497c-8feb-312e9b11364a","Type":"ContainerStarted","Data":"5492dab52accf1840b07eda783d535267a17296977246cdb2597977794742bf5"} Oct 07 11:25:29 crc kubenswrapper[4700]: I1007 11:25:29.939362 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fvvh8" podStartSLOduration=2.320067279 podStartE2EDuration="4.93933549s" podCreationTimestamp="2025-10-07 11:25:25 +0000 UTC" firstStartedPulling="2025-10-07 11:25:26.870462071 +0000 UTC m=+293.666861060" lastFinishedPulling="2025-10-07 11:25:29.489730272 +0000 UTC m=+296.286129271" observedRunningTime="2025-10-07 11:25:29.936449727 +0000 UTC m=+296.732848756" watchObservedRunningTime="2025-10-07 11:25:29.93933549 +0000 UTC 
m=+296.735734489" Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.919741 4700 generic.go:334] "Generic (PLEG): container finished" podID="4a424dee-5265-41d8-8ef7-fe5d2bcec50c" containerID="9125d14834f3eaeb7fa65f90ed0f246c1fbd6aaaa95fda44a211aac1e1b1a00a" exitCode=0 Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.919800 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5jbs" event={"ID":"4a424dee-5265-41d8-8ef7-fe5d2bcec50c","Type":"ContainerDied","Data":"9125d14834f3eaeb7fa65f90ed0f246c1fbd6aaaa95fda44a211aac1e1b1a00a"} Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.922613 4700 generic.go:334] "Generic (PLEG): container finished" podID="152601b0-4148-4077-bf15-899a1ee66ce7" containerID="ada7572aa57a9e3b1ba75a3a1735cea6e66050c75dbc683f1469be87ddedcacf" exitCode=0 Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.922759 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngg7w" event={"ID":"152601b0-4148-4077-bf15-899a1ee66ce7","Type":"ContainerDied","Data":"ada7572aa57a9e3b1ba75a3a1735cea6e66050c75dbc683f1469be87ddedcacf"} Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.927732 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerStarted","Data":"2f825b75fd626a149cdea06fd21b612da21badabe611adfcc3973f965b6135ea"} Oct 07 11:25:30 crc kubenswrapper[4700]: I1007 11:25:30.977830 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pw4v8" podStartSLOduration=3.201076602 podStartE2EDuration="5.977804128s" podCreationTimestamp="2025-10-07 11:25:25 +0000 UTC" firstStartedPulling="2025-10-07 11:25:26.865019253 +0000 UTC m=+293.661418242" lastFinishedPulling="2025-10-07 11:25:29.641746779 +0000 UTC m=+296.438145768" observedRunningTime="2025-10-07 
11:25:30.974869514 +0000 UTC m=+297.771268523" watchObservedRunningTime="2025-10-07 11:25:30.977804128 +0000 UTC m=+297.774203117" Oct 07 11:25:31 crc kubenswrapper[4700]: I1007 11:25:31.933637 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5jbs" event={"ID":"4a424dee-5265-41d8-8ef7-fe5d2bcec50c","Type":"ContainerStarted","Data":"d54b209349b959e2d176ae3614ca4eb52552066cb06051a675274f7661d9df0a"} Oct 07 11:25:31 crc kubenswrapper[4700]: I1007 11:25:31.935981 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngg7w" event={"ID":"152601b0-4148-4077-bf15-899a1ee66ce7","Type":"ContainerStarted","Data":"2454626ebe3dc98ea7d8d2f9c4fec089b4251d6d2f5b3a6ae80def2cd34fe181"} Oct 07 11:25:31 crc kubenswrapper[4700]: I1007 11:25:31.954704 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k5jbs" podStartSLOduration=2.211125352 podStartE2EDuration="4.954684779s" podCreationTimestamp="2025-10-07 11:25:27 +0000 UTC" firstStartedPulling="2025-10-07 11:25:28.906964075 +0000 UTC m=+295.703363064" lastFinishedPulling="2025-10-07 11:25:31.650523502 +0000 UTC m=+298.446922491" observedRunningTime="2025-10-07 11:25:31.951727654 +0000 UTC m=+298.748126653" watchObservedRunningTime="2025-10-07 11:25:31.954684779 +0000 UTC m=+298.751083768" Oct 07 11:25:31 crc kubenswrapper[4700]: I1007 11:25:31.977841 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngg7w" podStartSLOduration=2.153539075 podStartE2EDuration="4.977818804s" podCreationTimestamp="2025-10-07 11:25:27 +0000 UTC" firstStartedPulling="2025-10-07 11:25:28.881040319 +0000 UTC m=+295.677439348" lastFinishedPulling="2025-10-07 11:25:31.705320088 +0000 UTC m=+298.501719077" observedRunningTime="2025-10-07 11:25:31.974558642 +0000 UTC m=+298.770957651" watchObservedRunningTime="2025-10-07 
11:25:31.977818804 +0000 UTC m=+298.774217793" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.692117 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.692826 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.740023 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.917254 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.917341 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:35 crc kubenswrapper[4700]: I1007 11:25:35.984597 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:36 crc kubenswrapper[4700]: I1007 11:25:36.012354 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fvvh8" Oct 07 11:25:36 crc kubenswrapper[4700]: I1007 11:25:36.027409 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.108808 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.109141 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.159199 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.314021 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.314446 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:38 crc kubenswrapper[4700]: I1007 11:25:38.356960 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:25:39 crc kubenswrapper[4700]: I1007 11:25:39.025066 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngg7w" Oct 07 11:25:39 crc kubenswrapper[4700]: I1007 11:25:39.036113 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k5jbs" Oct 07 11:26:45 crc kubenswrapper[4700]: I1007 11:26:45.334615 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:26:45 crc kubenswrapper[4700]: I1007 11:26:45.335670 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:27:15 crc kubenswrapper[4700]: I1007 11:27:15.333564 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:27:15 crc kubenswrapper[4700]: I1007 11:27:15.334501 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.079820 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ccsbj"] Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.081743 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.104788 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ccsbj"] Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215190 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-trusted-ca\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215250 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215290 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgsrq\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-kube-api-access-rgsrq\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215328 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-certificates\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215351 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b35bd4d3-da96-4c83-bca0-0522f57090dc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215381 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-tls\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215395 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-bound-sa-token\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.215429 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b35bd4d3-da96-4c83-bca0-0522f57090dc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.250690 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317197 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgsrq\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-kube-api-access-rgsrq\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317264 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-certificates\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 
07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317298 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b35bd4d3-da96-4c83-bca0-0522f57090dc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317361 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-tls\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317383 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-bound-sa-token\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317437 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b35bd4d3-da96-4c83-bca0-0522f57090dc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.317469 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-trusted-ca\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.318781 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-certificates\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.319155 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b35bd4d3-da96-4c83-bca0-0522f57090dc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.319521 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b35bd4d3-da96-4c83-bca0-0522f57090dc-trusted-ca\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.325931 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b35bd4d3-da96-4c83-bca0-0522f57090dc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.326114 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-registry-tls\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: 
\"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.340382 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgsrq\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-kube-api-access-rgsrq\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.340475 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b35bd4d3-da96-4c83-bca0-0522f57090dc-bound-sa-token\") pod \"image-registry-66df7c8f76-ccsbj\" (UID: \"b35bd4d3-da96-4c83-bca0-0522f57090dc\") " pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.405687 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.736539 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ccsbj"] Oct 07 11:27:34 crc kubenswrapper[4700]: I1007 11:27:34.766487 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" event={"ID":"b35bd4d3-da96-4c83-bca0-0522f57090dc","Type":"ContainerStarted","Data":"680dd710fb4ac6aaf71daf46c69cd46a9994143f6995e629976c58ced2ec3987"} Oct 07 11:27:35 crc kubenswrapper[4700]: I1007 11:27:35.775176 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" event={"ID":"b35bd4d3-da96-4c83-bca0-0522f57090dc","Type":"ContainerStarted","Data":"586642b3cb828e31f3d5d19f72f920d2dd61a00e3286919d96b41e5af8e7998f"} Oct 07 11:27:35 crc kubenswrapper[4700]: I1007 11:27:35.775648 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:35 crc kubenswrapper[4700]: I1007 11:27:35.803914 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" podStartSLOduration=1.803889123 podStartE2EDuration="1.803889123s" podCreationTimestamp="2025-10-07 11:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:27:35.801995303 +0000 UTC m=+422.598394362" watchObservedRunningTime="2025-10-07 11:27:35.803889123 +0000 UTC m=+422.600288122" Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.333714 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.334149 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.334216 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.335017 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.335107 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd" gracePeriod=600 Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.848223 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd" exitCode=0 Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.848383 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd"} Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.848673 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c"} Oct 07 11:27:45 crc kubenswrapper[4700]: I1007 11:27:45.848706 4700 scope.go:117] "RemoveContainer" containerID="8ce702bc5de0802cbfc52ed8597ef780fa0873babfee573d2937d33236481fa6" Oct 07 11:27:54 crc kubenswrapper[4700]: I1007 11:27:54.413082 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ccsbj" Oct 07 11:27:54 crc kubenswrapper[4700]: I1007 11:27:54.474376 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"] Oct 07 11:28:19 crc kubenswrapper[4700]: I1007 11:28:19.526646 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" podUID="b8df7762-8c45-42ad-a645-83eb0bbed34a" containerName="registry" containerID="cri-o://a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a" gracePeriod=30 Oct 07 11:28:19 crc kubenswrapper[4700]: E1007 11:28:19.661413 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8df7762_8c45_42ad_a645_83eb0bbed34a.slice/crio-conmon-a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a.scope\": RecentStats: unable to find data in memory cache]" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.026043 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.051745 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.051829 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.051918 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7jsq\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.052156 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.052225 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.052269 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.052353 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.052410 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token\") pod \"b8df7762-8c45-42ad-a645-83eb0bbed34a\" (UID: \"b8df7762-8c45-42ad-a645-83eb0bbed34a\") " Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.053858 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.054124 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.063787 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq" (OuterVolumeSpecName: "kube-api-access-x7jsq") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "kube-api-access-x7jsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.068532 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.069766 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.071283 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.082071 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.103747 4700 generic.go:334] "Generic (PLEG): container finished" podID="b8df7762-8c45-42ad-a645-83eb0bbed34a" containerID="a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a" exitCode=0 Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.103800 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" event={"ID":"b8df7762-8c45-42ad-a645-83eb0bbed34a","Type":"ContainerDied","Data":"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a"} Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.103834 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" event={"ID":"b8df7762-8c45-42ad-a645-83eb0bbed34a","Type":"ContainerDied","Data":"479b3920f0016fb8980940d2c7706d0bda066f7f6836130ed3ea30a20334361e"} Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.103855 4700 scope.go:117] "RemoveContainer" containerID="a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.103979 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xshkj" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.105976 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b8df7762-8c45-42ad-a645-83eb0bbed34a" (UID: "b8df7762-8c45-42ad-a645-83eb0bbed34a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.127732 4700 scope.go:117] "RemoveContainer" containerID="a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a" Oct 07 11:28:20 crc kubenswrapper[4700]: E1007 11:28:20.128246 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a\": container with ID starting with a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a not found: ID does not exist" containerID="a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.128293 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a"} err="failed to get container status \"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a\": rpc error: code = NotFound desc = could not find container \"a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a\": container with ID starting with a149b9e903fe3816460bcfafb884f0c5d26b872f0806a05dcd1c05f22865899a not found: ID does not exist" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155452 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7jsq\" (UniqueName: 
\"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-kube-api-access-x7jsq\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155488 4700 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8df7762-8c45-42ad-a645-83eb0bbed34a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155499 4700 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8df7762-8c45-42ad-a645-83eb0bbed34a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155508 4700 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155518 4700 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155526 4700 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8df7762-8c45-42ad-a645-83eb0bbed34a-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.155535 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8df7762-8c45-42ad-a645-83eb0bbed34a-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 11:28:20.450530 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"] Oct 07 11:28:20 crc kubenswrapper[4700]: I1007 
11:28:20.457199 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xshkj"] Oct 07 11:28:21 crc kubenswrapper[4700]: I1007 11:28:21.970917 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8df7762-8c45-42ad-a645-83eb0bbed34a" path="/var/lib/kubelet/pods/b8df7762-8c45-42ad-a645-83eb0bbed34a/volumes" Oct 07 11:29:45 crc kubenswrapper[4700]: I1007 11:29:45.334268 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:29:45 crc kubenswrapper[4700]: I1007 11:29:45.335024 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.154825 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp"] Oct 07 11:30:00 crc kubenswrapper[4700]: E1007 11:30:00.156017 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8df7762-8c45-42ad-a645-83eb0bbed34a" containerName="registry" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.156041 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8df7762-8c45-42ad-a645-83eb0bbed34a" containerName="registry" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.156339 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8df7762-8c45-42ad-a645-83eb0bbed34a" containerName="registry" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.157202 4700 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.159918 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.160366 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.172443 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.172694 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.172764 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.173361 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp"] Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.273746 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.273799 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.273838 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.275139 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.282669 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.296878 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6\") pod \"collect-profiles-29330610-l56qp\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.533392 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.766181 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp"] Oct 07 11:30:00 crc kubenswrapper[4700]: I1007 11:30:00.798618 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" event={"ID":"3f8db84c-25d7-401f-b5f1-93fb1c324f87","Type":"ContainerStarted","Data":"fe98c1db5ea60ff8613bbd693023883e0613aef68707c46f54dfe401cffba92a"} Oct 07 11:30:01 crc kubenswrapper[4700]: I1007 11:30:01.807782 4700 generic.go:334] "Generic (PLEG): container finished" podID="3f8db84c-25d7-401f-b5f1-93fb1c324f87" containerID="f895705b2f8fe8c60aee9726859046272402b99218a6bdd7f69f6941b2c470b6" exitCode=0 Oct 07 11:30:01 crc kubenswrapper[4700]: I1007 11:30:01.807878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" 
event={"ID":"3f8db84c-25d7-401f-b5f1-93fb1c324f87","Type":"ContainerDied","Data":"f895705b2f8fe8c60aee9726859046272402b99218a6bdd7f69f6941b2c470b6"} Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.126688 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.315690 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume\") pod \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.315806 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6\") pod \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.315867 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume\") pod \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\" (UID: \"3f8db84c-25d7-401f-b5f1-93fb1c324f87\") " Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.316994 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f8db84c-25d7-401f-b5f1-93fb1c324f87" (UID: "3f8db84c-25d7-401f-b5f1-93fb1c324f87"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.323632 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f8db84c-25d7-401f-b5f1-93fb1c324f87" (UID: "3f8db84c-25d7-401f-b5f1-93fb1c324f87"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.324195 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6" (OuterVolumeSpecName: "kube-api-access-72fs6") pod "3f8db84c-25d7-401f-b5f1-93fb1c324f87" (UID: "3f8db84c-25d7-401f-b5f1-93fb1c324f87"). InnerVolumeSpecName "kube-api-access-72fs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.417259 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8db84c-25d7-401f-b5f1-93fb1c324f87-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.417350 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8db84c-25d7-401f-b5f1-93fb1c324f87-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.417372 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3f8db84c-25d7-401f-b5f1-93fb1c324f87-kube-api-access-72fs6\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.830082 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" 
event={"ID":"3f8db84c-25d7-401f-b5f1-93fb1c324f87","Type":"ContainerDied","Data":"fe98c1db5ea60ff8613bbd693023883e0613aef68707c46f54dfe401cffba92a"} Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.830135 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe98c1db5ea60ff8613bbd693023883e0613aef68707c46f54dfe401cffba92a" Oct 07 11:30:03 crc kubenswrapper[4700]: I1007 11:30:03.830196 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp" Oct 07 11:30:15 crc kubenswrapper[4700]: I1007 11:30:15.333699 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:30:15 crc kubenswrapper[4700]: I1007 11:30:15.334440 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.563182 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-x2cvv"] Oct 07 11:30:31 crc kubenswrapper[4700]: E1007 11:30:31.563972 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8db84c-25d7-401f-b5f1-93fb1c324f87" containerName="collect-profiles" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.563985 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8db84c-25d7-401f-b5f1-93fb1c324f87" containerName="collect-profiles" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.564104 4700 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3f8db84c-25d7-401f-b5f1-93fb1c324f87" containerName="collect-profiles" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.564526 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.567579 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.571788 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.571940 4700 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zmltb" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.586950 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-x2cvv"] Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.598414 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lv822"] Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.599291 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lv822" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.601843 4700 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vkx6j" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.612026 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nxpb9"] Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.612714 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.615928 4700 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-d4rv9" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.621207 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lv822"] Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.656970 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nxpb9"] Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.708011 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mlf7\" (UniqueName: \"kubernetes.io/projected/43b66d69-a0b2-4c0c-85d4-107ab9700398-kube-api-access-9mlf7\") pod \"cert-manager-5b446d88c5-lv822\" (UID: \"43b66d69-a0b2-4c0c-85d4-107ab9700398\") " pod="cert-manager/cert-manager-5b446d88c5-lv822" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.708602 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqptr\" (UniqueName: \"kubernetes.io/projected/df92deba-17c9-40f7-8079-4699a5c17bf8-kube-api-access-qqptr\") pod \"cert-manager-cainjector-7f985d654d-x2cvv\" (UID: \"df92deba-17c9-40f7-8079-4699a5c17bf8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.809964 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mlf7\" (UniqueName: \"kubernetes.io/projected/43b66d69-a0b2-4c0c-85d4-107ab9700398-kube-api-access-9mlf7\") pod \"cert-manager-5b446d88c5-lv822\" (UID: \"43b66d69-a0b2-4c0c-85d4-107ab9700398\") " pod="cert-manager/cert-manager-5b446d88c5-lv822" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.810060 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qqptr\" (UniqueName: \"kubernetes.io/projected/df92deba-17c9-40f7-8079-4699a5c17bf8-kube-api-access-qqptr\") pod \"cert-manager-cainjector-7f985d654d-x2cvv\" (UID: \"df92deba-17c9-40f7-8079-4699a5c17bf8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.810108 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrchw\" (UniqueName: \"kubernetes.io/projected/ad60c720-bd0b-4a09-807d-88587ca33ed7-kube-api-access-zrchw\") pod \"cert-manager-webhook-5655c58dd6-nxpb9\" (UID: \"ad60c720-bd0b-4a09-807d-88587ca33ed7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.829223 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mlf7\" (UniqueName: \"kubernetes.io/projected/43b66d69-a0b2-4c0c-85d4-107ab9700398-kube-api-access-9mlf7\") pod \"cert-manager-5b446d88c5-lv822\" (UID: \"43b66d69-a0b2-4c0c-85d4-107ab9700398\") " pod="cert-manager/cert-manager-5b446d88c5-lv822" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.829344 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqptr\" (UniqueName: \"kubernetes.io/projected/df92deba-17c9-40f7-8079-4699a5c17bf8-kube-api-access-qqptr\") pod \"cert-manager-cainjector-7f985d654d-x2cvv\" (UID: \"df92deba-17c9-40f7-8079-4699a5c17bf8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.883454 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.911395 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrchw\" (UniqueName: \"kubernetes.io/projected/ad60c720-bd0b-4a09-807d-88587ca33ed7-kube-api-access-zrchw\") pod \"cert-manager-webhook-5655c58dd6-nxpb9\" (UID: \"ad60c720-bd0b-4a09-807d-88587ca33ed7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.918398 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lv822" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.938426 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrchw\" (UniqueName: \"kubernetes.io/projected/ad60c720-bd0b-4a09-807d-88587ca33ed7-kube-api-access-zrchw\") pod \"cert-manager-webhook-5655c58dd6-nxpb9\" (UID: \"ad60c720-bd0b-4a09-807d-88587ca33ed7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:31 crc kubenswrapper[4700]: I1007 11:30:31.955864 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:32 crc kubenswrapper[4700]: I1007 11:30:32.156272 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lv822"] Oct 07 11:30:32 crc kubenswrapper[4700]: I1007 11:30:32.167893 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 11:30:32 crc kubenswrapper[4700]: I1007 11:30:32.204704 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-x2cvv"] Oct 07 11:30:32 crc kubenswrapper[4700]: I1007 11:30:32.219040 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nxpb9"] Oct 07 11:30:32 crc kubenswrapper[4700]: W1007 11:30:32.228064 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad60c720_bd0b_4a09_807d_88587ca33ed7.slice/crio-f8482dfc99ccfef3a941247c033f058e8aea78df864e53b66b0b32b5efa552f7 WatchSource:0}: Error finding container f8482dfc99ccfef3a941247c033f058e8aea78df864e53b66b0b32b5efa552f7: Status 404 returned error can't find the container with id f8482dfc99ccfef3a941247c033f058e8aea78df864e53b66b0b32b5efa552f7 Oct 07 11:30:33 crc kubenswrapper[4700]: I1007 11:30:33.046018 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" event={"ID":"df92deba-17c9-40f7-8079-4699a5c17bf8","Type":"ContainerStarted","Data":"0d86dd0ae2561a572c4957bcac70f3769c9220638c6b9a04334e5e14306f2c31"} Oct 07 11:30:33 crc kubenswrapper[4700]: I1007 11:30:33.049073 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lv822" event={"ID":"43b66d69-a0b2-4c0c-85d4-107ab9700398","Type":"ContainerStarted","Data":"8ccf72101b7d41ef1c5ae1aab27c9ddc0978fe1e395d492f1f2101af5fb0db86"} Oct 07 11:30:33 crc kubenswrapper[4700]: 
I1007 11:30:33.051905 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" event={"ID":"ad60c720-bd0b-4a09-807d-88587ca33ed7","Type":"ContainerStarted","Data":"f8482dfc99ccfef3a941247c033f058e8aea78df864e53b66b0b32b5efa552f7"} Oct 07 11:30:35 crc kubenswrapper[4700]: I1007 11:30:35.067855 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" event={"ID":"ad60c720-bd0b-4a09-807d-88587ca33ed7","Type":"ContainerStarted","Data":"22d30406fbb75f7326c64f9b1553dd15052beb7e63f34b93e52c0c68cd25196f"} Oct 07 11:30:35 crc kubenswrapper[4700]: I1007 11:30:35.069275 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:35 crc kubenswrapper[4700]: I1007 11:30:35.089736 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" podStartSLOduration=1.696455849 podStartE2EDuration="4.08927896s" podCreationTimestamp="2025-10-07 11:30:31 +0000 UTC" firstStartedPulling="2025-10-07 11:30:32.230179411 +0000 UTC m=+599.026578390" lastFinishedPulling="2025-10-07 11:30:34.623002502 +0000 UTC m=+601.419401501" observedRunningTime="2025-10-07 11:30:35.084084126 +0000 UTC m=+601.880483115" watchObservedRunningTime="2025-10-07 11:30:35.08927896 +0000 UTC m=+601.885677949" Oct 07 11:30:37 crc kubenswrapper[4700]: I1007 11:30:37.084019 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" event={"ID":"df92deba-17c9-40f7-8079-4699a5c17bf8","Type":"ContainerStarted","Data":"4914632c2b73b450e457703621384fd69532937c8f969da0e532b4860a513062"} Oct 07 11:30:37 crc kubenswrapper[4700]: I1007 11:30:37.088686 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lv822" 
event={"ID":"43b66d69-a0b2-4c0c-85d4-107ab9700398","Type":"ContainerStarted","Data":"1f0ef8502e0e4a644dbf1d8a3ae41b3c8585b6ca7400e6b092af3a1d101a5fa4"} Oct 07 11:30:37 crc kubenswrapper[4700]: I1007 11:30:37.111344 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-x2cvv" podStartSLOduration=2.0915965 podStartE2EDuration="6.11129432s" podCreationTimestamp="2025-10-07 11:30:31 +0000 UTC" firstStartedPulling="2025-10-07 11:30:32.207442973 +0000 UTC m=+599.003841962" lastFinishedPulling="2025-10-07 11:30:36.227140783 +0000 UTC m=+603.023539782" observedRunningTime="2025-10-07 11:30:37.109561436 +0000 UTC m=+603.905960495" watchObservedRunningTime="2025-10-07 11:30:37.11129432 +0000 UTC m=+603.907693319" Oct 07 11:30:37 crc kubenswrapper[4700]: I1007 11:30:37.135797 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-lv822" podStartSLOduration=2.129386377 podStartE2EDuration="6.135769673s" podCreationTimestamp="2025-10-07 11:30:31 +0000 UTC" firstStartedPulling="2025-10-07 11:30:32.167668065 +0000 UTC m=+598.964067054" lastFinishedPulling="2025-10-07 11:30:36.174051371 +0000 UTC m=+602.970450350" observedRunningTime="2025-10-07 11:30:37.130945128 +0000 UTC m=+603.927344157" watchObservedRunningTime="2025-10-07 11:30:37.135769673 +0000 UTC m=+603.932168702" Oct 07 11:30:41 crc kubenswrapper[4700]: I1007 11:30:41.969066 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nxpb9" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.337278 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fk4xc"] Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.338247 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" 
podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-controller" containerID="cri-o://648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339023 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="sbdb" containerID="cri-o://71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339136 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="nbdb" containerID="cri-o://e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339230 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="northd" containerID="cri-o://173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339668 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339813 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-node" containerID="cri-o://7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" 
gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.339916 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-acl-logging" containerID="cri-o://41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.397256 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" containerID="cri-o://42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" gracePeriod=30 Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.727772 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/3.log" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.732497 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovn-acl-logging/0.log" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.733655 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovn-controller/0.log" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.734719 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797592 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797678 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797740 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797763 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797802 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797851 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797863 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797892 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797934 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). 
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.797979 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5xw\" (UniqueName: \"kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798046 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798092 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798133 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798186 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798230 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798280 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798351 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798395 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798434 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798488 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 
11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798539 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798592 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798608 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash" (OuterVolumeSpecName: "host-slash") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798631 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798707 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798739 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\" (UID: \"d0a75e4c-2144-40de-9abc-f0bb7a143a0e\") " Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798775 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798815 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798855 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket" (OuterVolumeSpecName: "log-socket") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.798869 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799066 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799108 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799395 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799475 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log" (OuterVolumeSpecName: "node-log") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799548 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799610 4700 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799635 4700 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799661 4700 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799682 4700 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799703 4700 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799721 4700 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799745 4700 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799762 4700 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799815 4700 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799836 4700 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799858 4700 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc 
kubenswrapper[4700]: I1007 11:30:42.799870 4700 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.799909 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.805762 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.808475 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw" (OuterVolumeSpecName: "kube-api-access-6z5xw") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "kube-api-access-6z5xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.812834 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.815864 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nqb2t"] Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816207 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816239 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816260 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816273 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816288 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816300 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816346 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816359 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816379 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="northd" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816391 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="northd" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816412 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816426 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816450 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-node" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816463 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-node" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816478 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816490 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816503 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="sbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816514 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="sbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816532 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="nbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816543 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="nbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816561 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-acl-logging" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816573 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-acl-logging" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.816589 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kubecfg-setup" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816601 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kubecfg-setup" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816771 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-acl-logging" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816792 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816809 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816825 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovn-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816839 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816856 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816873 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816888 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816902 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="northd" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816919 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="nbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816934 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="kube-rbac-proxy-node" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.816950 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="sbdb" Oct 07 11:30:42 crc kubenswrapper[4700]: E1007 11:30:42.817104 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.817119 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerName="ovnkube-controller" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.820206 4700 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.827421 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d0a75e4c-2144-40de-9abc-f0bb7a143a0e" (UID: "d0a75e4c-2144-40de-9abc-f0bb7a143a0e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901721 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-ovn\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901826 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-etc-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901881 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-slash\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901930 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-systemd-units\") pod 
\"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901964 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-bin\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.901998 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-env-overrides\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902031 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-systemd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-node-log\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902095 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-log-socket\") pod 
\"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902123 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-script-lib\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902157 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-var-lib-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902190 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovn-node-metrics-cert\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902220 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902251 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-netns\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902281 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902380 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-netd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902436 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-config\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902490 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzmg\" (UniqueName: \"kubernetes.io/projected/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-kube-api-access-2rzmg\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902531 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902576 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-kubelet\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902922 4700 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.902994 4700 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903022 4700 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903104 4700 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903126 4700 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903153 4700 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903181 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5xw\" (UniqueName: \"kubernetes.io/projected/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-kube-api-access-6z5xw\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:42 crc kubenswrapper[4700]: I1007 11:30:42.903209 4700 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0a75e4c-2144-40de-9abc-f0bb7a143a0e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.004856 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-slash\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005097 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-systemd-units\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005179 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-bin\") pod 
\"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005218 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-env-overrides\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005187 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-slash\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005347 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-systemd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005339 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-systemd-units\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005255 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-systemd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 
07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005488 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-bin\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005673 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-node-log\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005736 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-log-socket\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005786 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-script-lib\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005847 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-var-lib-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005881 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovn-node-metrics-cert\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005868 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-node-log\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005957 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.005916 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006041 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-var-lib-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006075 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-netns\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006158 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006207 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-netd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006280 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-config\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006330 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-netns\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006900 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-log-socket\") pod 
\"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006895 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006899 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzmg\" (UniqueName: \"kubernetes.io/projected/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-kube-api-access-2rzmg\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.006966 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-cni-netd\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007017 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007204 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007293 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-env-overrides\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007363 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-kubelet\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007298 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-script-lib\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007396 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-host-kubelet\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007597 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-ovn\") pod 
\"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007687 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-etc-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007768 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-run-ovn\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.007828 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovnkube-config\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.008346 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-etc-openvswitch\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.011290 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-ovn-node-metrics-cert\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.038352 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzmg\" (UniqueName: \"kubernetes.io/projected/f01802d7-7bda-4bd9-b9af-fc3a312a34ab-kube-api-access-2rzmg\") pod \"ovnkube-node-nqb2t\" (UID: \"f01802d7-7bda-4bd9-b9af-fc3a312a34ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.132584 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/2.log" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.133786 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/1.log" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.133874 4700 generic.go:334] "Generic (PLEG): container finished" podID="869af552-a034-4af4-b46a-492798633d24" containerID="bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5" exitCode=2 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.134023 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerDied","Data":"bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.134090 4700 scope.go:117] "RemoveContainer" containerID="3343c4ca5375885915b7fdee68df54d6460363b58b4802ff380f38334a9312bb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.135033 4700 scope.go:117] "RemoveContainer" containerID="bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.135534 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=kube-multus pod=multus-zhd4s_openshift-multus(869af552-a034-4af4-b46a-492798633d24)\"" pod="openshift-multus/multus-zhd4s" podUID="869af552-a034-4af4-b46a-492798633d24" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.137779 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.139091 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovnkube-controller/3.log" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.146856 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovn-acl-logging/0.log" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148016 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk4xc_d0a75e4c-2144-40de-9abc-f0bb7a143a0e/ovn-controller/0.log" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148782 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148833 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148854 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148877 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148897 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148914 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" exitCode=0 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148932 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" exitCode=143 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148950 4700 generic.go:334] "Generic (PLEG): container finished" podID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" exitCode=143 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148952 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149010 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149038 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" 
event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149059 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149079 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149104 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149126 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149145 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149158 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149170 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149183 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149195 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149206 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149218 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149231 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149242 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149258 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149274 4700 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149286 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149298 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149334 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149346 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149358 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149368 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149379 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149391 4700 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149402 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149416 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149434 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149446 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149456 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149466 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149477 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 
11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149487 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149499 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149510 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149520 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149532 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149546 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" event={"ID":"d0a75e4c-2144-40de-9abc-f0bb7a143a0e","Type":"ContainerDied","Data":"a49cc3a7f7bd18c4a35ef0fd84fdff299f285659a581d3dbc11583db5f606b3a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149563 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149580 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149594 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149613 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149625 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149636 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149647 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149658 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149669 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.149680 4700 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.148966 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk4xc" Oct 07 11:30:43 crc kubenswrapper[4700]: W1007 11:30:43.180415 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01802d7_7bda_4bd9_b9af_fc3a312a34ab.slice/crio-4c41bf025a19bec40eb553250e231156323fbdc2b7283ced97a6a8f0b51667a9 WatchSource:0}: Error finding container 4c41bf025a19bec40eb553250e231156323fbdc2b7283ced97a6a8f0b51667a9: Status 404 returned error can't find the container with id 4c41bf025a19bec40eb553250e231156323fbdc2b7283ced97a6a8f0b51667a9 Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.184110 4700 scope.go:117] "RemoveContainer" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.204970 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fk4xc"] Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.209757 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fk4xc"] Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.227688 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.252420 4700 scope.go:117] "RemoveContainer" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.277498 4700 scope.go:117] "RemoveContainer" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.318165 4700 scope.go:117] "RemoveContainer" 
containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.355741 4700 scope.go:117] "RemoveContainer" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.385976 4700 scope.go:117] "RemoveContainer" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.535814 4700 scope.go:117] "RemoveContainer" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.555593 4700 scope.go:117] "RemoveContainer" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.574051 4700 scope.go:117] "RemoveContainer" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.588447 4700 scope.go:117] "RemoveContainer" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.588923 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": container with ID starting with 42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad not found: ID does not exist" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.588971 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} err="failed to get container status \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": rpc error: code = NotFound desc = could not 
find container \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": container with ID starting with 42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.589002 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.589346 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": container with ID starting with 86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb not found: ID does not exist" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.589393 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} err="failed to get container status \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": rpc error: code = NotFound desc = could not find container \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": container with ID starting with 86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.589421 4700 scope.go:117] "RemoveContainer" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.589677 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": container with ID starting with 71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a not found: ID 
does not exist" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.589705 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} err="failed to get container status \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": rpc error: code = NotFound desc = could not find container \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": container with ID starting with 71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.589726 4700 scope.go:117] "RemoveContainer" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.590166 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": container with ID starting with e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0 not found: ID does not exist" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.590195 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} err="failed to get container status \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": rpc error: code = NotFound desc = could not find container \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": container with ID starting with e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.590213 4700 
scope.go:117] "RemoveContainer" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.590722 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": container with ID starting with 173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86 not found: ID does not exist" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.590810 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} err="failed to get container status \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": rpc error: code = NotFound desc = could not find container \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": container with ID starting with 173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.590871 4700 scope.go:117] "RemoveContainer" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.591344 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": container with ID starting with d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23 not found: ID does not exist" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.591376 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} err="failed to get container status \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": rpc error: code = NotFound desc = could not find container \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": container with ID starting with d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.591396 4700 scope.go:117] "RemoveContainer" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.592012 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": container with ID starting with 7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb not found: ID does not exist" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592042 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} err="failed to get container status \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": rpc error: code = NotFound desc = could not find container \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": container with ID starting with 7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592061 4700 scope.go:117] "RemoveContainer" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.592452 4700 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": container with ID starting with 41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8 not found: ID does not exist" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592505 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} err="failed to get container status \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": rpc error: code = NotFound desc = could not find container \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": container with ID starting with 41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592537 4700 scope.go:117] "RemoveContainer" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.592881 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": container with ID starting with 648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452 not found: ID does not exist" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592912 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} err="failed to get container status \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": rpc error: code = NotFound desc = could not find container 
\"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": container with ID starting with 648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.592933 4700 scope.go:117] "RemoveContainer" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: E1007 11:30:43.593334 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": container with ID starting with 6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379 not found: ID does not exist" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.593402 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} err="failed to get container status \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": rpc error: code = NotFound desc = could not find container \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": container with ID starting with 6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.593443 4700 scope.go:117] "RemoveContainer" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.593778 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} err="failed to get container status \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": rpc error: code = NotFound desc = could not find 
container \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": container with ID starting with 42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.593809 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.594187 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} err="failed to get container status \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": rpc error: code = NotFound desc = could not find container \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": container with ID starting with 86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.594233 4700 scope.go:117] "RemoveContainer" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.594567 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} err="failed to get container status \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": rpc error: code = NotFound desc = could not find container \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": container with ID starting with 71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.594595 4700 scope.go:117] "RemoveContainer" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.595159 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} err="failed to get container status \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": rpc error: code = NotFound desc = could not find container \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": container with ID starting with e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.595196 4700 scope.go:117] "RemoveContainer" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.595669 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} err="failed to get container status \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": rpc error: code = NotFound desc = could not find container \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": container with ID starting with 173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.595705 4700 scope.go:117] "RemoveContainer" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.596042 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} err="failed to get container status \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": rpc error: code = NotFound desc = could not find container \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": container with ID starting with 
d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.596077 4700 scope.go:117] "RemoveContainer" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.596544 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} err="failed to get container status \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": rpc error: code = NotFound desc = could not find container \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": container with ID starting with 7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.596607 4700 scope.go:117] "RemoveContainer" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.597003 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} err="failed to get container status \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": rpc error: code = NotFound desc = could not find container \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": container with ID starting with 41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.597040 4700 scope.go:117] "RemoveContainer" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.597489 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} err="failed to get container status \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": rpc error: code = NotFound desc = could not find container \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": container with ID starting with 648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.597543 4700 scope.go:117] "RemoveContainer" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.598022 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} err="failed to get container status \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": rpc error: code = NotFound desc = could not find container \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": container with ID starting with 6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.598057 4700 scope.go:117] "RemoveContainer" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.598555 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} err="failed to get container status \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": rpc error: code = NotFound desc = could not find container \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": container with ID starting with 42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad not found: ID does not 
exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.598612 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.599210 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} err="failed to get container status \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": rpc error: code = NotFound desc = could not find container \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": container with ID starting with 86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.599241 4700 scope.go:117] "RemoveContainer" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.599883 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} err="failed to get container status \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": rpc error: code = NotFound desc = could not find container \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": container with ID starting with 71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.599923 4700 scope.go:117] "RemoveContainer" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.600384 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} err="failed to get container status 
\"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": rpc error: code = NotFound desc = could not find container \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": container with ID starting with e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.600412 4700 scope.go:117] "RemoveContainer" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.600782 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} err="failed to get container status \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": rpc error: code = NotFound desc = could not find container \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": container with ID starting with 173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.600842 4700 scope.go:117] "RemoveContainer" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.601263 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} err="failed to get container status \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": rpc error: code = NotFound desc = could not find container \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": container with ID starting with d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.601290 4700 scope.go:117] "RemoveContainer" 
containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.601943 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} err="failed to get container status \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": rpc error: code = NotFound desc = could not find container \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": container with ID starting with 7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.601979 4700 scope.go:117] "RemoveContainer" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.602472 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} err="failed to get container status \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": rpc error: code = NotFound desc = could not find container \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": container with ID starting with 41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.602528 4700 scope.go:117] "RemoveContainer" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603020 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} err="failed to get container status \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": rpc error: code = NotFound desc = could 
not find container \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": container with ID starting with 648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603059 4700 scope.go:117] "RemoveContainer" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603471 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} err="failed to get container status \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": rpc error: code = NotFound desc = could not find container \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": container with ID starting with 6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603503 4700 scope.go:117] "RemoveContainer" containerID="42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603823 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad"} err="failed to get container status \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": rpc error: code = NotFound desc = could not find container \"42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad\": container with ID starting with 42401a336cd3f03495d93fb5f03a7d4274ada94d3d81f1e8290a31f25f00bbad not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.603888 4700 scope.go:117] "RemoveContainer" containerID="86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 
11:30:43.604292 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb"} err="failed to get container status \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": rpc error: code = NotFound desc = could not find container \"86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb\": container with ID starting with 86540f7fd2c2a3387c6bf7f79bfd3c9fc54dc7ced04664bd60a291f8e9399ceb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.604350 4700 scope.go:117] "RemoveContainer" containerID="71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.604647 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a"} err="failed to get container status \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": rpc error: code = NotFound desc = could not find container \"71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a\": container with ID starting with 71289efcbed87fbdab1561f028ff95bbdb5d0ddd65b142f1e5bd62be168a309a not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.604685 4700 scope.go:117] "RemoveContainer" containerID="e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605025 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0"} err="failed to get container status \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": rpc error: code = NotFound desc = could not find container \"e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0\": container with ID starting with 
e626bee848ef46f14709d2fe347c773f7705ae21acc50b1d81a652b08baf78c0 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605045 4700 scope.go:117] "RemoveContainer" containerID="173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605417 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86"} err="failed to get container status \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": rpc error: code = NotFound desc = could not find container \"173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86\": container with ID starting with 173cc165159634f0ed32f1b38851a047ae510d572452927b101c1d87b4858f86 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605455 4700 scope.go:117] "RemoveContainer" containerID="d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605836 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23"} err="failed to get container status \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": rpc error: code = NotFound desc = could not find container \"d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23\": container with ID starting with d84fe5db338bda1fd47c733b6da5990478778637314718c820189bb8b01c2f23 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.605890 4700 scope.go:117] "RemoveContainer" containerID="7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.606387 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb"} err="failed to get container status \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": rpc error: code = NotFound desc = could not find container \"7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb\": container with ID starting with 7cb6d9d1eb1253291da0d3d3ff3341122d2b0cabfee7757c3a1716f4b034ddeb not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.606431 4700 scope.go:117] "RemoveContainer" containerID="41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.606956 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8"} err="failed to get container status \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": rpc error: code = NotFound desc = could not find container \"41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8\": container with ID starting with 41323d024efb7e6efbe385fa27ccbd991f977ed383acb33e0c60ad23126627b8 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.607009 4700 scope.go:117] "RemoveContainer" containerID="648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.607493 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452"} err="failed to get container status \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": rpc error: code = NotFound desc = could not find container \"648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452\": container with ID starting with 648914d093457c40763b027c38bd998ea80c7bc065f464cfb7f97df8a68c9452 not found: ID does not 
exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.607537 4700 scope.go:117] "RemoveContainer" containerID="6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.607975 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379"} err="failed to get container status \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": rpc error: code = NotFound desc = could not find container \"6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379\": container with ID starting with 6ce847a7963c2a0c1e6630b1ec76dab37c307515bd02cb35769ffb7ff4886379 not found: ID does not exist" Oct 07 11:30:43 crc kubenswrapper[4700]: I1007 11:30:43.969614 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a75e4c-2144-40de-9abc-f0bb7a143a0e" path="/var/lib/kubelet/pods/d0a75e4c-2144-40de-9abc-f0bb7a143a0e/volumes" Oct 07 11:30:44 crc kubenswrapper[4700]: I1007 11:30:44.164874 4700 generic.go:334] "Generic (PLEG): container finished" podID="f01802d7-7bda-4bd9-b9af-fc3a312a34ab" containerID="ae10e9c37e0ffbfa5ddc628830a28846ba6875b2a067626e4faba43ac55b74e5" exitCode=0 Oct 07 11:30:44 crc kubenswrapper[4700]: I1007 11:30:44.165071 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerDied","Data":"ae10e9c37e0ffbfa5ddc628830a28846ba6875b2a067626e4faba43ac55b74e5"} Oct 07 11:30:44 crc kubenswrapper[4700]: I1007 11:30:44.165342 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"4c41bf025a19bec40eb553250e231156323fbdc2b7283ced97a6a8f0b51667a9"} Oct 07 11:30:44 crc kubenswrapper[4700]: I1007 11:30:44.170727 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/2.log" Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184067 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"0d469799ab49137fd0a3d12c38eda2936bd869f66548f6871a5682c826b5d38a"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184560 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"3edf3be26df3a48b733afa3fa22d86caf9e25a50738085da75b30c44fb983f89"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184587 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"7f4aba75c9504e6115b999a7a32c8c0258cff8b8116f2cce9fc7cbbf1bfcf4f7"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184607 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"1ef2b7b4dad53d6f7c461bf05c393c1875948d596998f695947b12beeef33a6b"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184625 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"8cf9af5064bbe01b15e6be9e6c78bce225d66c1b0ab5f5829353d295702f41b9"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.184644 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" 
event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"0b17a7328bf8219f00c3e41aa648d7a0a1d15d4976407a0783a43c3b6a136817"} Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.334608 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.334692 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.334754 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.335337 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:30:45 crc kubenswrapper[4700]: I1007 11:30:45.335419 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c" gracePeriod=600 Oct 07 11:30:46 crc kubenswrapper[4700]: I1007 11:30:46.198686 
4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c" exitCode=0 Oct 07 11:30:46 crc kubenswrapper[4700]: I1007 11:30:46.198776 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c"} Oct 07 11:30:46 crc kubenswrapper[4700]: I1007 11:30:46.199214 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6"} Oct 07 11:30:46 crc kubenswrapper[4700]: I1007 11:30:46.199263 4700 scope.go:117] "RemoveContainer" containerID="eccd7bcc9d2e4841d5c5ffebb71c3562830e1b3391f2acddd70627baba88e9fd" Oct 07 11:30:48 crc kubenswrapper[4700]: I1007 11:30:48.224244 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"93ddea9c6486f432d83461bb57447f22badbb2214fcad9639a4509aa9d1fa502"} Oct 07 11:30:50 crc kubenswrapper[4700]: I1007 11:30:50.241635 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" event={"ID":"f01802d7-7bda-4bd9-b9af-fc3a312a34ab","Type":"ContainerStarted","Data":"6d719e653b9399289b605cb5c44ecb0644041a48e8725e18210fbdd270ef2b15"} Oct 07 11:30:50 crc kubenswrapper[4700]: I1007 11:30:50.242443 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:50 crc kubenswrapper[4700]: I1007 11:30:50.242468 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:50 crc kubenswrapper[4700]: I1007 11:30:50.271090 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" podStartSLOduration=8.271071009 podStartE2EDuration="8.271071009s" podCreationTimestamp="2025-10-07 11:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:30:50.269914859 +0000 UTC m=+617.066313848" watchObservedRunningTime="2025-10-07 11:30:50.271071009 +0000 UTC m=+617.067469998" Oct 07 11:30:50 crc kubenswrapper[4700]: I1007 11:30:50.276766 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:51 crc kubenswrapper[4700]: I1007 11:30:51.248172 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:51 crc kubenswrapper[4700]: I1007 11:30:51.296036 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:30:57 crc kubenswrapper[4700]: I1007 11:30:57.958227 4700 scope.go:117] "RemoveContainer" containerID="bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5" Oct 07 11:30:57 crc kubenswrapper[4700]: E1007 11:30:57.959536 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zhd4s_openshift-multus(869af552-a034-4af4-b46a-492798633d24)\"" pod="openshift-multus/multus-zhd4s" podUID="869af552-a034-4af4-b46a-492798633d24" Oct 07 11:31:12 crc kubenswrapper[4700]: I1007 11:31:12.957993 4700 scope.go:117] "RemoveContainer" containerID="bc3f96fdc39238a4256a6eefb116100d5bff845db1385f44433113a8962718d5" Oct 07 11:31:13 crc kubenswrapper[4700]: I1007 
11:31:13.173957 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nqb2t" Oct 07 11:31:13 crc kubenswrapper[4700]: I1007 11:31:13.419451 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zhd4s_869af552-a034-4af4-b46a-492798633d24/kube-multus/2.log" Oct 07 11:31:13 crc kubenswrapper[4700]: I1007 11:31:13.419500 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zhd4s" event={"ID":"869af552-a034-4af4-b46a-492798633d24","Type":"ContainerStarted","Data":"d9ec2ca8c4ebe11358cb32814e531515a3b255e32672e95c62fd5cc585569e3b"} Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.174981 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7"] Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.176351 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.178733 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.191637 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7"] Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.371089 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc 
kubenswrapper[4700]: I1007 11:31:19.371131 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgwl6\" (UniqueName: \"kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.371176 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.474580 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.474690 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgwl6\" (UniqueName: \"kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.474809 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.475382 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.475515 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.520389 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgwl6\" (UniqueName: \"kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:19 crc kubenswrapper[4700]: I1007 11:31:19.796745 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:20 crc kubenswrapper[4700]: I1007 11:31:20.046026 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7"] Oct 07 11:31:20 crc kubenswrapper[4700]: I1007 11:31:20.472582 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerStarted","Data":"1287ed9e4b42535bcfc18e40fd7c3b94699f33d52f1f0ef2a664577a2db47c07"} Oct 07 11:31:20 crc kubenswrapper[4700]: I1007 11:31:20.472658 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerStarted","Data":"a5759ae60b07d0ec10091ec66f07f40e229afeabea8df2b3c411f66154c707d0"} Oct 07 11:31:21 crc kubenswrapper[4700]: I1007 11:31:21.479371 4700 generic.go:334] "Generic (PLEG): container finished" podID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerID="1287ed9e4b42535bcfc18e40fd7c3b94699f33d52f1f0ef2a664577a2db47c07" exitCode=0 Oct 07 11:31:21 crc kubenswrapper[4700]: I1007 11:31:21.479439 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerDied","Data":"1287ed9e4b42535bcfc18e40fd7c3b94699f33d52f1f0ef2a664577a2db47c07"} Oct 07 11:31:23 crc kubenswrapper[4700]: I1007 11:31:23.498433 4700 generic.go:334] "Generic (PLEG): container finished" podID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerID="3a97bb0fe8a9858adf6e661ef8a1bc50859659a15365130b80cdba1d7b49a658" exitCode=0 Oct 07 11:31:23 crc kubenswrapper[4700]: I1007 11:31:23.498530 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerDied","Data":"3a97bb0fe8a9858adf6e661ef8a1bc50859659a15365130b80cdba1d7b49a658"} Oct 07 11:31:24 crc kubenswrapper[4700]: I1007 11:31:24.511070 4700 generic.go:334] "Generic (PLEG): container finished" podID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerID="a77eb875776c7acad0225c73812e777352357fb6a32e4c4844f541fc51b895d4" exitCode=0 Oct 07 11:31:24 crc kubenswrapper[4700]: I1007 11:31:24.512786 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerDied","Data":"a77eb875776c7acad0225c73812e777352357fb6a32e4c4844f541fc51b895d4"} Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.844187 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.968123 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util\") pod \"012580db-7236-454c-a0c9-e53ca0cefe4c\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.968214 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle\") pod \"012580db-7236-454c-a0c9-e53ca0cefe4c\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.968495 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgwl6\" (UniqueName: \"kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6\") pod \"012580db-7236-454c-a0c9-e53ca0cefe4c\" (UID: \"012580db-7236-454c-a0c9-e53ca0cefe4c\") " Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.968958 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle" (OuterVolumeSpecName: "bundle") pod "012580db-7236-454c-a0c9-e53ca0cefe4c" (UID: "012580db-7236-454c-a0c9-e53ca0cefe4c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.969260 4700 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:31:25 crc kubenswrapper[4700]: I1007 11:31:25.980365 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6" (OuterVolumeSpecName: "kube-api-access-pgwl6") pod "012580db-7236-454c-a0c9-e53ca0cefe4c" (UID: "012580db-7236-454c-a0c9-e53ca0cefe4c"). InnerVolumeSpecName "kube-api-access-pgwl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.067593 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util" (OuterVolumeSpecName: "util") pod "012580db-7236-454c-a0c9-e53ca0cefe4c" (UID: "012580db-7236-454c-a0c9-e53ca0cefe4c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.070630 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgwl6\" (UniqueName: \"kubernetes.io/projected/012580db-7236-454c-a0c9-e53ca0cefe4c-kube-api-access-pgwl6\") on node \"crc\" DevicePath \"\"" Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.070674 4700 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/012580db-7236-454c-a0c9-e53ca0cefe4c-util\") on node \"crc\" DevicePath \"\"" Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.527709 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" event={"ID":"012580db-7236-454c-a0c9-e53ca0cefe4c","Type":"ContainerDied","Data":"a5759ae60b07d0ec10091ec66f07f40e229afeabea8df2b3c411f66154c707d0"} Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.527767 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5759ae60b07d0ec10091ec66f07f40e229afeabea8df2b3c411f66154c707d0" Oct 07 11:31:26 crc kubenswrapper[4700]: I1007 11:31:26.527823 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.839943 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf"] Oct 07 11:31:27 crc kubenswrapper[4700]: E1007 11:31:27.840158 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="extract" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.840170 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="extract" Oct 07 11:31:27 crc kubenswrapper[4700]: E1007 11:31:27.840180 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="util" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.840185 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="util" Oct 07 11:31:27 crc kubenswrapper[4700]: E1007 11:31:27.840208 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="pull" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.840215 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="pull" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.840348 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="012580db-7236-454c-a0c9-e53ca0cefe4c" containerName="extract" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.840775 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.844791 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.845272 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.845269 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bv2qm" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.859351 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf"] Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.897650 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tjj\" (UniqueName: \"kubernetes.io/projected/ab86639c-d812-4e56-9f44-f8727fa8b6b5-kube-api-access-m2tjj\") pod \"nmstate-operator-858ddd8f98-h9nxf\" (UID: \"ab86639c-d812-4e56-9f44-f8727fa8b6b5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" Oct 07 11:31:27 crc kubenswrapper[4700]: I1007 11:31:27.998674 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tjj\" (UniqueName: \"kubernetes.io/projected/ab86639c-d812-4e56-9f44-f8727fa8b6b5-kube-api-access-m2tjj\") pod \"nmstate-operator-858ddd8f98-h9nxf\" (UID: \"ab86639c-d812-4e56-9f44-f8727fa8b6b5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" Oct 07 11:31:28 crc kubenswrapper[4700]: I1007 11:31:28.019039 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tjj\" (UniqueName: \"kubernetes.io/projected/ab86639c-d812-4e56-9f44-f8727fa8b6b5-kube-api-access-m2tjj\") pod \"nmstate-operator-858ddd8f98-h9nxf\" (UID: 
\"ab86639c-d812-4e56-9f44-f8727fa8b6b5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" Oct 07 11:31:28 crc kubenswrapper[4700]: I1007 11:31:28.166791 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" Oct 07 11:31:28 crc kubenswrapper[4700]: I1007 11:31:28.446877 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf"] Oct 07 11:31:28 crc kubenswrapper[4700]: I1007 11:31:28.539883 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" event={"ID":"ab86639c-d812-4e56-9f44-f8727fa8b6b5","Type":"ContainerStarted","Data":"0eff8cb98cd5927e2aaadf3c6284ef342c95787d3f35666db9e784e7f2875c0d"} Oct 07 11:31:31 crc kubenswrapper[4700]: I1007 11:31:31.569067 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" event={"ID":"ab86639c-d812-4e56-9f44-f8727fa8b6b5","Type":"ContainerStarted","Data":"4b5345eccc7b0294f3ed92236e3496ef2d008aeb29ea4452dfb89da064df1082"} Oct 07 11:31:31 crc kubenswrapper[4700]: I1007 11:31:31.591830 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-h9nxf" podStartSLOduration=2.521978865 podStartE2EDuration="4.591802155s" podCreationTimestamp="2025-10-07 11:31:27 +0000 UTC" firstStartedPulling="2025-10-07 11:31:28.458601198 +0000 UTC m=+655.255000207" lastFinishedPulling="2025-10-07 11:31:30.528424508 +0000 UTC m=+657.324823497" observedRunningTime="2025-10-07 11:31:31.587917112 +0000 UTC m=+658.384316121" watchObservedRunningTime="2025-10-07 11:31:31.591802155 +0000 UTC m=+658.388201174" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.569854 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 
11:31:32.572601 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.581469 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.582217 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.582506 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pvlqk" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.586085 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.604393 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.606796 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.626057 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7q2md"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.626810 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727095 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sln98\" (UniqueName: \"kubernetes.io/projected/7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2-kube-api-access-sln98\") pod \"nmstate-metrics-fdff9cb8d-svznx\" (UID: \"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727156 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-nmstate-lock\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727183 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727209 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-ovs-socket\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727232 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-dbus-socket\") pod 
\"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727252 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bkbq\" (UniqueName: \"kubernetes.io/projected/c2c057c9-3fea-45df-9991-448998e13a79-kube-api-access-9bkbq\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.727326 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2xbs\" (UniqueName: \"kubernetes.io/projected/607c033a-3b68-4731-92c3-c9a9a08acd5c-kube-api-access-q2xbs\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.799180 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.800487 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.802534 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mq4jf" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.802587 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.802601 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.817642 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx"] Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828219 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-ovs-socket\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828264 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-dbus-socket\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828282 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bkbq\" (UniqueName: \"kubernetes.io/projected/c2c057c9-3fea-45df-9991-448998e13a79-kube-api-access-9bkbq\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828420 
4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-ovs-socket\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828548 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2xbs\" (UniqueName: \"kubernetes.io/projected/607c033a-3b68-4731-92c3-c9a9a08acd5c-kube-api-access-q2xbs\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828608 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sln98\" (UniqueName: \"kubernetes.io/projected/7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2-kube-api-access-sln98\") pod \"nmstate-metrics-fdff9cb8d-svznx\" (UID: \"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828620 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-dbus-socket\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828629 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-nmstate-lock\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828677 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2c057c9-3fea-45df-9991-448998e13a79-nmstate-lock\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.828700 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: E1007 11:31:32.828822 4700 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 07 11:31:32 crc kubenswrapper[4700]: E1007 11:31:32.828878 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair podName:607c033a-3b68-4731-92c3-c9a9a08acd5c nodeName:}" failed. No retries permitted until 2025-10-07 11:31:33.328861242 +0000 UTC m=+660.125260231 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair") pod "nmstate-webhook-6cdbc54649-k4h2w" (UID: "607c033a-3b68-4731-92c3-c9a9a08acd5c") : secret "openshift-nmstate-webhook" not found Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.848258 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bkbq\" (UniqueName: \"kubernetes.io/projected/c2c057c9-3fea-45df-9991-448998e13a79-kube-api-access-9bkbq\") pod \"nmstate-handler-7q2md\" (UID: \"c2c057c9-3fea-45df-9991-448998e13a79\") " pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.849903 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2xbs\" (UniqueName: \"kubernetes.io/projected/607c033a-3b68-4731-92c3-c9a9a08acd5c-kube-api-access-q2xbs\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.855095 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sln98\" (UniqueName: \"kubernetes.io/projected/7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2-kube-api-access-sln98\") pod \"nmstate-metrics-fdff9cb8d-svznx\" (UID: \"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.907041 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.930188 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6fpc\" (UniqueName: \"kubernetes.io/projected/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-kube-api-access-z6fpc\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.930289 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.930415 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:32 crc kubenswrapper[4700]: I1007 11:31:32.957177 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:32 crc kubenswrapper[4700]: W1007 11:31:32.987135 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c057c9_3fea_45df_9991_448998e13a79.slice/crio-8c0313c6ecae5a3c53dad20e620e1b4d37138d1af816d886bf6f2dcdc6cbfedb WatchSource:0}: Error finding container 8c0313c6ecae5a3c53dad20e620e1b4d37138d1af816d886bf6f2dcdc6cbfedb: Status 404 returned error can't find the container with id 8c0313c6ecae5a3c53dad20e620e1b4d37138d1af816d886bf6f2dcdc6cbfedb Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.017469 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bcfd589bd-9cvs7"] Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.018411 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.028803 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcfd589bd-9cvs7"] Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031219 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6fpc\" (UniqueName: \"kubernetes.io/projected/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-kube-api-access-z6fpc\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031260 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxmb\" (UniqueName: \"kubernetes.io/projected/f8ad810d-8714-40d0-93dd-41cd99c62606-kube-api-access-jwxmb\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 
11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031286 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031321 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-console-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031344 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-oauth-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031361 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-trusted-ca-bundle\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-oauth-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: 
\"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031402 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031417 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-service-ca\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.031438 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.032629 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.035566 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-plugin-serving-cert\") pod 
\"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.057375 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6fpc\" (UniqueName: \"kubernetes.io/projected/65303df1-c2ad-4edf-83c1-8b3d6bce8c33-kube-api-access-z6fpc\") pod \"nmstate-console-plugin-6b874cbd85-9r6fx\" (UID: \"65303df1-c2ad-4edf-83c1-8b3d6bce8c33\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.118724 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132452 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxmb\" (UniqueName: \"kubernetes.io/projected/f8ad810d-8714-40d0-93dd-41cd99c62606-kube-api-access-jwxmb\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132530 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-console-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132560 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-oauth-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc 
kubenswrapper[4700]: I1007 11:31:33.132579 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-trusted-ca-bundle\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132602 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-oauth-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132628 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.132647 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-service-ca\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.133987 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-service-ca\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.135088 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-console-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.135887 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-oauth-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.136211 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8ad810d-8714-40d0-93dd-41cd99c62606-trusted-ca-bundle\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.139709 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-serving-cert\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.141123 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8ad810d-8714-40d0-93dd-41cd99c62606-console-oauth-config\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.152415 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jwxmb\" (UniqueName: \"kubernetes.io/projected/f8ad810d-8714-40d0-93dd-41cd99c62606-kube-api-access-jwxmb\") pod \"console-5bcfd589bd-9cvs7\" (UID: \"f8ad810d-8714-40d0-93dd-41cd99c62606\") " pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.169337 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx"] Oct 07 11:31:33 crc kubenswrapper[4700]: W1007 11:31:33.176272 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d6bc40a_05b0_4ff6_9111_44cc8bc88ea2.slice/crio-3c8cb502464cb4a2f574b0ac6e0761a4b4a8e2f54f0a72a38367b48bced04c32 WatchSource:0}: Error finding container 3c8cb502464cb4a2f574b0ac6e0761a4b4a8e2f54f0a72a38367b48bced04c32: Status 404 returned error can't find the container with id 3c8cb502464cb4a2f574b0ac6e0761a4b4a8e2f54f0a72a38367b48bced04c32 Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.293973 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx"] Oct 07 11:31:33 crc kubenswrapper[4700]: W1007 11:31:33.296922 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65303df1_c2ad_4edf_83c1_8b3d6bce8c33.slice/crio-e2b44158b0220f1dbcc268189902f2d83f26856e3594cd2f2043e99dd8e64cce WatchSource:0}: Error finding container e2b44158b0220f1dbcc268189902f2d83f26856e3594cd2f2043e99dd8e64cce: Status 404 returned error can't find the container with id e2b44158b0220f1dbcc268189902f2d83f26856e3594cd2f2043e99dd8e64cce Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.334929 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.335399 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.339471 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/607c033a-3b68-4731-92c3-c9a9a08acd5c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k4h2w\" (UID: \"607c033a-3b68-4731-92c3-c9a9a08acd5c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.509019 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcfd589bd-9cvs7"] Oct 07 11:31:33 crc kubenswrapper[4700]: W1007 11:31:33.516361 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ad810d_8714_40d0_93dd_41cd99c62606.slice/crio-bd07ce3f0c363024988a2cdf4bbfc82a805369c3dfa7a55d7473fdad1be38588 WatchSource:0}: Error finding container bd07ce3f0c363024988a2cdf4bbfc82a805369c3dfa7a55d7473fdad1be38588: Status 404 returned error can't find the container with id bd07ce3f0c363024988a2cdf4bbfc82a805369c3dfa7a55d7473fdad1be38588 Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.525201 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.584905 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7q2md" event={"ID":"c2c057c9-3fea-45df-9991-448998e13a79","Type":"ContainerStarted","Data":"8c0313c6ecae5a3c53dad20e620e1b4d37138d1af816d886bf6f2dcdc6cbfedb"} Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.585805 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcfd589bd-9cvs7" event={"ID":"f8ad810d-8714-40d0-93dd-41cd99c62606","Type":"ContainerStarted","Data":"bd07ce3f0c363024988a2cdf4bbfc82a805369c3dfa7a55d7473fdad1be38588"} Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.586777 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" event={"ID":"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2","Type":"ContainerStarted","Data":"3c8cb502464cb4a2f574b0ac6e0761a4b4a8e2f54f0a72a38367b48bced04c32"} Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.588018 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" event={"ID":"65303df1-c2ad-4edf-83c1-8b3d6bce8c33","Type":"ContainerStarted","Data":"e2b44158b0220f1dbcc268189902f2d83f26856e3594cd2f2043e99dd8e64cce"} Oct 07 11:31:33 crc kubenswrapper[4700]: I1007 11:31:33.761113 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w"] Oct 07 11:31:33 crc kubenswrapper[4700]: W1007 11:31:33.766247 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607c033a_3b68_4731_92c3_c9a9a08acd5c.slice/crio-ec49c7270a4ae0c74478d257974b4ba15fe77f693489e7e4c401d2f2f61d816c WatchSource:0}: Error finding container ec49c7270a4ae0c74478d257974b4ba15fe77f693489e7e4c401d2f2f61d816c: Status 404 returned error can't 
find the container with id ec49c7270a4ae0c74478d257974b4ba15fe77f693489e7e4c401d2f2f61d816c Oct 07 11:31:34 crc kubenswrapper[4700]: I1007 11:31:34.597480 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" event={"ID":"607c033a-3b68-4731-92c3-c9a9a08acd5c","Type":"ContainerStarted","Data":"ec49c7270a4ae0c74478d257974b4ba15fe77f693489e7e4c401d2f2f61d816c"} Oct 07 11:31:34 crc kubenswrapper[4700]: I1007 11:31:34.599647 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcfd589bd-9cvs7" event={"ID":"f8ad810d-8714-40d0-93dd-41cd99c62606","Type":"ContainerStarted","Data":"7ee2d963ae177aa329585b880d35f78b6a3c7b07cd69ad4365ea58bcaa3cf781"} Oct 07 11:31:34 crc kubenswrapper[4700]: I1007 11:31:34.636332 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bcfd589bd-9cvs7" podStartSLOduration=2.636268249 podStartE2EDuration="2.636268249s" podCreationTimestamp="2025-10-07 11:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:31:34.621643223 +0000 UTC m=+661.418042222" watchObservedRunningTime="2025-10-07 11:31:34.636268249 +0000 UTC m=+661.432667248" Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.613394 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" event={"ID":"607c033a-3b68-4731-92c3-c9a9a08acd5c","Type":"ContainerStarted","Data":"7e51fc1618a1657e4a7ebd2b46bf669a9d1b5e343c61ab60c88add604e0b9211"} Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.614345 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.617092 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7q2md" 
event={"ID":"c2c057c9-3fea-45df-9991-448998e13a79","Type":"ContainerStarted","Data":"01230075a1a11ae4f7c44314b3dc5175e3a0698a17221771d176e619c60ead9e"} Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.617255 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.619436 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" event={"ID":"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2","Type":"ContainerStarted","Data":"85fac0402d1dd7e037a199de1eb81a8f4b01d2b3854a08a233f07b36498827b1"} Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.620731 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" event={"ID":"65303df1-c2ad-4edf-83c1-8b3d6bce8c33","Type":"ContainerStarted","Data":"6bd3d3065b4712f09eaae2749f2cd5393194d2c52d72906da111dfb1f27bd020"} Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.638551 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" podStartSLOduration=2.182772444 podStartE2EDuration="4.638531655s" podCreationTimestamp="2025-10-07 11:31:32 +0000 UTC" firstStartedPulling="2025-10-07 11:31:33.768437772 +0000 UTC m=+660.564836761" lastFinishedPulling="2025-10-07 11:31:36.224196983 +0000 UTC m=+663.020595972" observedRunningTime="2025-10-07 11:31:36.63568245 +0000 UTC m=+663.432081459" watchObservedRunningTime="2025-10-07 11:31:36.638531655 +0000 UTC m=+663.434930654" Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.662176 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7q2md" podStartSLOduration=1.428203056 podStartE2EDuration="4.662149238s" podCreationTimestamp="2025-10-07 11:31:32 +0000 UTC" firstStartedPulling="2025-10-07 11:31:32.991660338 +0000 UTC 
m=+659.788059327" lastFinishedPulling="2025-10-07 11:31:36.22560652 +0000 UTC m=+663.022005509" observedRunningTime="2025-10-07 11:31:36.657839335 +0000 UTC m=+663.454238324" watchObservedRunningTime="2025-10-07 11:31:36.662149238 +0000 UTC m=+663.458548267" Oct 07 11:31:36 crc kubenswrapper[4700]: I1007 11:31:36.671942 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-9r6fx" podStartSLOduration=1.751078513 podStartE2EDuration="4.671933206s" podCreationTimestamp="2025-10-07 11:31:32 +0000 UTC" firstStartedPulling="2025-10-07 11:31:33.299221682 +0000 UTC m=+660.095620671" lastFinishedPulling="2025-10-07 11:31:36.220076375 +0000 UTC m=+663.016475364" observedRunningTime="2025-10-07 11:31:36.671423363 +0000 UTC m=+663.467822362" watchObservedRunningTime="2025-10-07 11:31:36.671933206 +0000 UTC m=+663.468332205" Oct 07 11:31:40 crc kubenswrapper[4700]: I1007 11:31:40.656103 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" event={"ID":"7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2","Type":"ContainerStarted","Data":"766f01a47fdf58fd00bfe78d3bcd334af95f252f372ca287f455de079ef0f8b4"} Oct 07 11:31:41 crc kubenswrapper[4700]: I1007 11:31:41.695810 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-svznx" podStartSLOduration=2.653971505 podStartE2EDuration="9.695777854s" podCreationTimestamp="2025-10-07 11:31:32 +0000 UTC" firstStartedPulling="2025-10-07 11:31:33.182063701 +0000 UTC m=+659.978462690" lastFinishedPulling="2025-10-07 11:31:40.22387001 +0000 UTC m=+667.020269039" observedRunningTime="2025-10-07 11:31:41.690002191 +0000 UTC m=+668.486401270" watchObservedRunningTime="2025-10-07 11:31:41.695777854 +0000 UTC m=+668.492176873" Oct 07 11:31:42 crc kubenswrapper[4700]: I1007 11:31:42.982886 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-7q2md" Oct 07 11:31:43 crc kubenswrapper[4700]: I1007 11:31:43.335604 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:43 crc kubenswrapper[4700]: I1007 11:31:43.335946 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:43 crc kubenswrapper[4700]: I1007 11:31:43.342372 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:43 crc kubenswrapper[4700]: I1007 11:31:43.699702 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bcfd589bd-9cvs7" Oct 07 11:31:43 crc kubenswrapper[4700]: I1007 11:31:43.761676 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:31:53 crc kubenswrapper[4700]: I1007 11:31:53.532745 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k4h2w" Oct 07 11:32:08 crc kubenswrapper[4700]: I1007 11:32:08.838000 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xx8r8" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerName="console" containerID="cri-o://fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63" gracePeriod=15 Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.259359 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xx8r8_96767da2-aab3-4f6d-a14b-ada1c0a4ded8/console/0.log" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.259691 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383483 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxwj\" (UniqueName: \"kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383534 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383557 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383577 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383642 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.383693 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384156 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384200 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca" (OuterVolumeSpecName: "service-ca") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384232 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config" (OuterVolumeSpecName: "console-config") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384340 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert\") pod \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\" (UID: \"96767da2-aab3-4f6d-a14b-ada1c0a4ded8\") " Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384643 4700 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384664 4700 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384685 4700 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.384699 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.391601 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.391685 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj" (OuterVolumeSpecName: "kube-api-access-7xxwj") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "kube-api-access-7xxwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.391801 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96767da2-aab3-4f6d-a14b-ada1c0a4ded8" (UID: "96767da2-aab3-4f6d-a14b-ada1c0a4ded8"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.486498 4700 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.486539 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxwj\" (UniqueName: \"kubernetes.io/projected/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-kube-api-access-7xxwj\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.486556 4700 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.486584 4700 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96767da2-aab3-4f6d-a14b-ada1c0a4ded8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.546804 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk"] Oct 07 11:32:09 crc kubenswrapper[4700]: E1007 11:32:09.547079 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerName="console" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.547096 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerName="console" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.547211 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerName="console" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 
11:32:09.548083 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.555130 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.565607 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk"] Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.688789 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.688946 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vpq2\" (UniqueName: \"kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.689012 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.790460 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vpq2\" (UniqueName: \"kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.790544 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.790691 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.791066 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.791508 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.821361 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vpq2\" (UniqueName: \"kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.864909 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879691 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xx8r8_96767da2-aab3-4f6d-a14b-ada1c0a4ded8/console/0.log" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879757 4700 generic.go:334] "Generic (PLEG): container finished" podID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" containerID="fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63" exitCode=2 Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879795 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xx8r8" event={"ID":"96767da2-aab3-4f6d-a14b-ada1c0a4ded8","Type":"ContainerDied","Data":"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63"} Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879831 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-xx8r8" event={"ID":"96767da2-aab3-4f6d-a14b-ada1c0a4ded8","Type":"ContainerDied","Data":"c6538b61de4ebf7d45dffdffbabd115c544a735d56563decaa9c4b00e96a73f5"} Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879853 4700 scope.go:117] "RemoveContainer" containerID="fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.879903 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xx8r8" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.915059 4700 scope.go:117] "RemoveContainer" containerID="fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63" Oct 07 11:32:09 crc kubenswrapper[4700]: E1007 11:32:09.918018 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63\": container with ID starting with fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63 not found: ID does not exist" containerID="fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.918090 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63"} err="failed to get container status \"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63\": rpc error: code = NotFound desc = could not find container \"fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63\": container with ID starting with fa3bd17f007cfb8592075be953c6daea733fd393b8a147ebad10b7e321ec0b63 not found: ID does not exist" Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.927169 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:32:09 crc 
kubenswrapper[4700]: I1007 11:32:09.933802 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xx8r8"] Oct 07 11:32:09 crc kubenswrapper[4700]: I1007 11:32:09.973520 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96767da2-aab3-4f6d-a14b-ada1c0a4ded8" path="/var/lib/kubelet/pods/96767da2-aab3-4f6d-a14b-ada1c0a4ded8/volumes" Oct 07 11:32:10 crc kubenswrapper[4700]: I1007 11:32:10.175320 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk"] Oct 07 11:32:10 crc kubenswrapper[4700]: I1007 11:32:10.891883 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerStarted","Data":"587925b76e4d10187273c336e241a10a3f1f14bf620724a06fed27000450159d"} Oct 07 11:32:10 crc kubenswrapper[4700]: I1007 11:32:10.892380 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerStarted","Data":"2a6b4ea38626e4684aac7ca57fcd7e8df7e6f523825161ba0a88f7265b851f8f"} Oct 07 11:32:11 crc kubenswrapper[4700]: I1007 11:32:11.901863 4700 generic.go:334] "Generic (PLEG): container finished" podID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerID="587925b76e4d10187273c336e241a10a3f1f14bf620724a06fed27000450159d" exitCode=0 Oct 07 11:32:11 crc kubenswrapper[4700]: I1007 11:32:11.901936 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerDied","Data":"587925b76e4d10187273c336e241a10a3f1f14bf620724a06fed27000450159d"} Oct 07 11:32:13 crc kubenswrapper[4700]: I1007 
11:32:13.919861 4700 generic.go:334] "Generic (PLEG): container finished" podID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerID="e60ad33c23b0024e8ecd54093025dc19c81babc3e8120dc750f9735e64a23954" exitCode=0 Oct 07 11:32:13 crc kubenswrapper[4700]: I1007 11:32:13.919999 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerDied","Data":"e60ad33c23b0024e8ecd54093025dc19c81babc3e8120dc750f9735e64a23954"} Oct 07 11:32:14 crc kubenswrapper[4700]: I1007 11:32:14.932394 4700 generic.go:334] "Generic (PLEG): container finished" podID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerID="bda045e6e4862bd7262333e5eb7045b4c68ba4e2dff3345fe637277ce6a8810f" exitCode=0 Oct 07 11:32:14 crc kubenswrapper[4700]: I1007 11:32:14.932492 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerDied","Data":"bda045e6e4862bd7262333e5eb7045b4c68ba4e2dff3345fe637277ce6a8810f"} Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.275707 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.389993 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vpq2\" (UniqueName: \"kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2\") pod \"13ec320d-824d-4483-bdc7-6a419cbdd630\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.390089 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util\") pod \"13ec320d-824d-4483-bdc7-6a419cbdd630\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.390120 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle\") pod \"13ec320d-824d-4483-bdc7-6a419cbdd630\" (UID: \"13ec320d-824d-4483-bdc7-6a419cbdd630\") " Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.391624 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle" (OuterVolumeSpecName: "bundle") pod "13ec320d-824d-4483-bdc7-6a419cbdd630" (UID: "13ec320d-824d-4483-bdc7-6a419cbdd630"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.391800 4700 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.395378 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2" (OuterVolumeSpecName: "kube-api-access-4vpq2") pod "13ec320d-824d-4483-bdc7-6a419cbdd630" (UID: "13ec320d-824d-4483-bdc7-6a419cbdd630"). InnerVolumeSpecName "kube-api-access-4vpq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.493030 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vpq2\" (UniqueName: \"kubernetes.io/projected/13ec320d-824d-4483-bdc7-6a419cbdd630-kube-api-access-4vpq2\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.687985 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util" (OuterVolumeSpecName: "util") pod "13ec320d-824d-4483-bdc7-6a419cbdd630" (UID: "13ec320d-824d-4483-bdc7-6a419cbdd630"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.695615 4700 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13ec320d-824d-4483-bdc7-6a419cbdd630-util\") on node \"crc\" DevicePath \"\"" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.950606 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" event={"ID":"13ec320d-824d-4483-bdc7-6a419cbdd630","Type":"ContainerDied","Data":"2a6b4ea38626e4684aac7ca57fcd7e8df7e6f523825161ba0a88f7265b851f8f"} Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.950760 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6b4ea38626e4684aac7ca57fcd7e8df7e6f523825161ba0a88f7265b851f8f" Oct 07 11:32:16 crc kubenswrapper[4700]: I1007 11:32:16.950706 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.693032 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-779847879b-zbmnp"] Oct 07 11:32:24 crc kubenswrapper[4700]: E1007 11:32:24.693757 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="extract" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.693770 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="extract" Oct 07 11:32:24 crc kubenswrapper[4700]: E1007 11:32:24.693782 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="util" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.693787 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="util" Oct 07 11:32:24 crc kubenswrapper[4700]: E1007 11:32:24.693809 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="pull" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.693815 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="pull" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.693899 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ec320d-824d-4483-bdc7-6a419cbdd630" containerName="extract" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.694314 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.696875 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.696989 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.697646 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kpg5p" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.697760 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.697716 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.716395 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-779847879b-zbmnp"] Oct 07 11:32:24 crc kubenswrapper[4700]: 
I1007 11:32:24.805071 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q5qz\" (UniqueName: \"kubernetes.io/projected/7089843f-2eed-4318-b373-ff19fa518a8d-kube-api-access-9q5qz\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.805121 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-apiservice-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.805217 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-webhook-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.906047 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-apiservice-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.906137 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-webhook-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.906188 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q5qz\" (UniqueName: \"kubernetes.io/projected/7089843f-2eed-4318-b373-ff19fa518a8d-kube-api-access-9q5qz\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.914333 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-apiservice-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.920937 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7089843f-2eed-4318-b373-ff19fa518a8d-webhook-cert\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.921180 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q5qz\" (UniqueName: \"kubernetes.io/projected/7089843f-2eed-4318-b373-ff19fa518a8d-kube-api-access-9q5qz\") pod \"metallb-operator-controller-manager-779847879b-zbmnp\" (UID: \"7089843f-2eed-4318-b373-ff19fa518a8d\") " 
pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.995743 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85"] Oct 07 11:32:24 crc kubenswrapper[4700]: I1007 11:32:24.996502 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.002459 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.002540 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.002634 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wlnvx" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.011127 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85"] Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.011942 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.012538 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-webhook-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.012614 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jpb\" (UniqueName: \"kubernetes.io/projected/97d901d3-6f2a-4e96-8578-f169200d5f6a-kube-api-access-79jpb\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.012890 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-apiservice-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.113669 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-webhook-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.113744 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-79jpb\" (UniqueName: \"kubernetes.io/projected/97d901d3-6f2a-4e96-8578-f169200d5f6a-kube-api-access-79jpb\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.113803 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-apiservice-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.119538 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-apiservice-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.123358 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d901d3-6f2a-4e96-8578-f169200d5f6a-webhook-cert\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.134864 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jpb\" (UniqueName: \"kubernetes.io/projected/97d901d3-6f2a-4e96-8578-f169200d5f6a-kube-api-access-79jpb\") pod \"metallb-operator-webhook-server-654f9bf6d-jll85\" (UID: \"97d901d3-6f2a-4e96-8578-f169200d5f6a\") " 
pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.274262 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-779847879b-zbmnp"] Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.312587 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.525354 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85"] Oct 07 11:32:25 crc kubenswrapper[4700]: W1007 11:32:25.543909 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97d901d3_6f2a_4e96_8578_f169200d5f6a.slice/crio-99fa745f8d40945a3e3d9a92eee1f6e85337343ab1cc659617824e9aea2a9993 WatchSource:0}: Error finding container 99fa745f8d40945a3e3d9a92eee1f6e85337343ab1cc659617824e9aea2a9993: Status 404 returned error can't find the container with id 99fa745f8d40945a3e3d9a92eee1f6e85337343ab1cc659617824e9aea2a9993 Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.998160 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" event={"ID":"97d901d3-6f2a-4e96-8578-f169200d5f6a","Type":"ContainerStarted","Data":"99fa745f8d40945a3e3d9a92eee1f6e85337343ab1cc659617824e9aea2a9993"} Oct 07 11:32:25 crc kubenswrapper[4700]: I1007 11:32:25.999813 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" event={"ID":"7089843f-2eed-4318-b373-ff19fa518a8d","Type":"ContainerStarted","Data":"611328a64ce16eb614421d237321f1d89426f69eaf1182eea9ae0b543539ee4f"} Oct 07 11:32:29 crc kubenswrapper[4700]: I1007 11:32:29.026830 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" event={"ID":"7089843f-2eed-4318-b373-ff19fa518a8d","Type":"ContainerStarted","Data":"69c71f13ea39f9cfd0e7d95c425844ba0d834a78699b07f460ade8fdd85d8790"} Oct 07 11:32:29 crc kubenswrapper[4700]: I1007 11:32:29.028183 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:32:29 crc kubenswrapper[4700]: I1007 11:32:29.059239 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" podStartSLOduration=1.762976436 podStartE2EDuration="5.059214607s" podCreationTimestamp="2025-10-07 11:32:24 +0000 UTC" firstStartedPulling="2025-10-07 11:32:25.296228681 +0000 UTC m=+712.092627670" lastFinishedPulling="2025-10-07 11:32:28.592466852 +0000 UTC m=+715.388865841" observedRunningTime="2025-10-07 11:32:29.056163037 +0000 UTC m=+715.852562016" watchObservedRunningTime="2025-10-07 11:32:29.059214607 +0000 UTC m=+715.855613596" Oct 07 11:32:31 crc kubenswrapper[4700]: I1007 11:32:31.042096 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" event={"ID":"97d901d3-6f2a-4e96-8578-f169200d5f6a","Type":"ContainerStarted","Data":"24e1de133e841f8bd7c652789fc23e22fe3a4c532a9c055fd190c7cf29d6ce97"} Oct 07 11:32:31 crc kubenswrapper[4700]: I1007 11:32:31.042478 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:31 crc kubenswrapper[4700]: I1007 11:32:31.066047 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" podStartSLOduration=1.937605757 podStartE2EDuration="7.066026862s" podCreationTimestamp="2025-10-07 11:32:24 +0000 UTC" firstStartedPulling="2025-10-07 11:32:25.546867644 
+0000 UTC m=+712.343266623" lastFinishedPulling="2025-10-07 11:32:30.675288709 +0000 UTC m=+717.471687728" observedRunningTime="2025-10-07 11:32:31.061318878 +0000 UTC m=+717.857717877" watchObservedRunningTime="2025-10-07 11:32:31.066026862 +0000 UTC m=+717.862425841" Oct 07 11:32:45 crc kubenswrapper[4700]: I1007 11:32:45.320407 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-654f9bf6d-jll85" Oct 07 11:32:45 crc kubenswrapper[4700]: I1007 11:32:45.333494 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:32:45 crc kubenswrapper[4700]: I1007 11:32:45.333567 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.015131 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-779847879b-zbmnp" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.794533 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zz4cl"] Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.797349 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.800233 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nwzr4" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.800474 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.800248 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.805421 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf"] Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.806418 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.807868 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.812079 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf"] Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.886866 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pv9ct"] Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.887646 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-conf\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.887712 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zpnnh\" (UniqueName: \"kubernetes.io/projected/d2a66390-1e1f-433a-8637-8f9c03197ab8-kube-api-access-zpnnh\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.887776 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pv9ct" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.887777 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-reloader\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.888378 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/173c5607-1006-4c8c-afc1-79c8248bbe7a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.888523 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-sockets\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.888655 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 
11:33:05.888769 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-startup\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.888932 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.889057 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrxr\" (UniqueName: \"kubernetes.io/projected/173c5607-1006-4c8c-afc1-79c8248bbe7a-kube-api-access-ctrxr\") pod \"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.890516 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.890760 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.890960 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lwwsw" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.893443 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.916803 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-47w4l"] Oct 07 
11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.918069 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.922107 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.940211 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-47w4l"] Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.990219 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/173c5607-1006-4c8c-afc1-79c8248bbe7a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991441 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991541 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-sockets\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991650 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " 
pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991742 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-startup\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991831 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:05 crc kubenswrapper[4700]: E1007 11:33:05.991877 4700 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991991 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992028 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5qb\" (UniqueName: \"kubernetes.io/projected/30cc7c7a-bda9-4b40-972b-6c87af01ad23-kube-api-access-bz5qb\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992061 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrxr\" (UniqueName: \"kubernetes.io/projected/173c5607-1006-4c8c-afc1-79c8248bbe7a-kube-api-access-ctrxr\") pod 
\"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992117 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-conf\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992146 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnnh\" (UniqueName: \"kubernetes.io/projected/d2a66390-1e1f-433a-8637-8f9c03197ab8-kube-api-access-zpnnh\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992170 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-reloader\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.992202 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metallb-excludel2\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.991886 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-sockets\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc 
kubenswrapper[4700]: E1007 11:33:05.992785 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs podName:d2a66390-1e1f-433a-8637-8f9c03197ab8 nodeName:}" failed. No retries permitted until 2025-10-07 11:33:06.49276416 +0000 UTC m=+753.289163149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs") pod "frr-k8s-zz4cl" (UID: "d2a66390-1e1f-433a-8637-8f9c03197ab8") : secret "frr-k8s-certs-secret" not found Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.993254 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-startup\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.994155 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.995855 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-reloader\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:05 crc kubenswrapper[4700]: I1007 11:33:05.996053 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2a66390-1e1f-433a-8637-8f9c03197ab8-frr-conf\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 
11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.004816 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/173c5607-1006-4c8c-afc1-79c8248bbe7a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.015453 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnnh\" (UniqueName: \"kubernetes.io/projected/d2a66390-1e1f-433a-8637-8f9c03197ab8-kube-api-access-zpnnh\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.017860 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrxr\" (UniqueName: \"kubernetes.io/projected/173c5607-1006-4c8c-afc1-79c8248bbe7a-kube-api-access-ctrxr\") pod \"frr-k8s-webhook-server-64bf5d555-59fcf\" (UID: \"173c5607-1006-4c8c-afc1-79c8248bbe7a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093001 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-metrics-certs\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093160 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093261 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5qb\" (UniqueName: \"kubernetes.io/projected/30cc7c7a-bda9-4b40-972b-6c87af01ad23-kube-api-access-bz5qb\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.093415 4700 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.093515 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist podName:30cc7c7a-bda9-4b40-972b-6c87af01ad23 nodeName:}" failed. No retries permitted until 2025-10-07 11:33:06.593495684 +0000 UTC m=+753.389894673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist") pod "speaker-pv9ct" (UID: "30cc7c7a-bda9-4b40-972b-6c87af01ad23") : secret "metallb-memberlist" not found Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093431 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metallb-excludel2\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093668 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093736 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sg7mn\" (UniqueName: \"kubernetes.io/projected/ab4aa75e-5802-4ba7-b88c-655fad15d8af-kube-api-access-sg7mn\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.093775 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-cert\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.093835 4700 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.093867 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs podName:30cc7c7a-bda9-4b40-972b-6c87af01ad23 nodeName:}" failed. No retries permitted until 2025-10-07 11:33:06.593859773 +0000 UTC m=+753.390258762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs") pod "speaker-pv9ct" (UID: "30cc7c7a-bda9-4b40-972b-6c87af01ad23") : secret "speaker-certs-secret" not found Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.094045 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metallb-excludel2\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.117881 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5qb\" (UniqueName: \"kubernetes.io/projected/30cc7c7a-bda9-4b40-972b-6c87af01ad23-kube-api-access-bz5qb\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.132373 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.194888 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg7mn\" (UniqueName: \"kubernetes.io/projected/ab4aa75e-5802-4ba7-b88c-655fad15d8af-kube-api-access-sg7mn\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.194938 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-cert\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.194969 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-metrics-certs\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.199749 4700 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.200467 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-metrics-certs\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.210133 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ab4aa75e-5802-4ba7-b88c-655fad15d8af-cert\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.212177 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7mn\" (UniqueName: \"kubernetes.io/projected/ab4aa75e-5802-4ba7-b88c-655fad15d8af-kube-api-access-sg7mn\") pod \"controller-68d546b9d8-47w4l\" (UID: \"ab4aa75e-5802-4ba7-b88c-655fad15d8af\") " pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.279827 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.504936 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.513004 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a66390-1e1f-433a-8637-8f9c03197ab8-metrics-certs\") pod \"frr-k8s-zz4cl\" (UID: \"d2a66390-1e1f-433a-8637-8f9c03197ab8\") " pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.576872 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-47w4l"] Oct 07 11:33:06 crc kubenswrapper[4700]: W1007 11:33:06.589688 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4aa75e_5802_4ba7_b88c_655fad15d8af.slice/crio-f6c1d8de2e1d64bcf03cf259c25e8a9e18fcd7a96669fcce5e0ec39908eaf6e6 WatchSource:0}: Error 
finding container f6c1d8de2e1d64bcf03cf259c25e8a9e18fcd7a96669fcce5e0ec39908eaf6e6: Status 404 returned error can't find the container with id f6c1d8de2e1d64bcf03cf259c25e8a9e18fcd7a96669fcce5e0ec39908eaf6e6 Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.606285 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf"] Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.607032 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.607335 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.607563 4700 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 11:33:06 crc kubenswrapper[4700]: E1007 11:33:06.607672 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist podName:30cc7c7a-bda9-4b40-972b-6c87af01ad23 nodeName:}" failed. No retries permitted until 2025-10-07 11:33:07.607645198 +0000 UTC m=+754.404044207 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist") pod "speaker-pv9ct" (UID: "30cc7c7a-bda9-4b40-972b-6c87af01ad23") : secret "metallb-memberlist" not found Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.611128 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-metrics-certs\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:06 crc kubenswrapper[4700]: W1007 11:33:06.614025 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173c5607_1006_4c8c_afc1_79c8248bbe7a.slice/crio-d6ec6bdb0a1dd1e9f3854fdbaf4b6e10c072a214ae75bb91489a5cab4821286e WatchSource:0}: Error finding container d6ec6bdb0a1dd1e9f3854fdbaf4b6e10c072a214ae75bb91489a5cab4821286e: Status 404 returned error can't find the container with id d6ec6bdb0a1dd1e9f3854fdbaf4b6e10c072a214ae75bb91489a5cab4821286e Oct 07 11:33:06 crc kubenswrapper[4700]: I1007 11:33:06.722655 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.324200 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" event={"ID":"173c5607-1006-4c8c-afc1-79c8248bbe7a","Type":"ContainerStarted","Data":"d6ec6bdb0a1dd1e9f3854fdbaf4b6e10c072a214ae75bb91489a5cab4821286e"} Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.326284 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"55a32af4baf13cd3a1514004657986a0bf11611613f41a1ecbe564552649dae4"} Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.328631 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-47w4l" event={"ID":"ab4aa75e-5802-4ba7-b88c-655fad15d8af","Type":"ContainerStarted","Data":"9d4b15ade4151e9cc269559abb7952df8fdc2a86967a4b2718e1ebf820b83694"} Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.328666 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-47w4l" event={"ID":"ab4aa75e-5802-4ba7-b88c-655fad15d8af","Type":"ContainerStarted","Data":"0dcf62229e0b13bb27b698754c214ac797004f7929589f4d6ff60f65c3f309c6"} Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.328680 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-47w4l" event={"ID":"ab4aa75e-5802-4ba7-b88c-655fad15d8af","Type":"ContainerStarted","Data":"f6c1d8de2e1d64bcf03cf259c25e8a9e18fcd7a96669fcce5e0ec39908eaf6e6"} Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.328834 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.358452 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/controller-68d546b9d8-47w4l" podStartSLOduration=2.358418416 podStartE2EDuration="2.358418416s" podCreationTimestamp="2025-10-07 11:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:33:07.353440505 +0000 UTC m=+754.149839564" watchObservedRunningTime="2025-10-07 11:33:07.358418416 +0000 UTC m=+754.154817435" Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.623880 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.640952 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/30cc7c7a-bda9-4b40-972b-6c87af01ad23-memberlist\") pod \"speaker-pv9ct\" (UID: \"30cc7c7a-bda9-4b40-972b-6c87af01ad23\") " pod="metallb-system/speaker-pv9ct" Oct 07 11:33:07 crc kubenswrapper[4700]: I1007 11:33:07.705074 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pv9ct" Oct 07 11:33:07 crc kubenswrapper[4700]: W1007 11:33:07.728849 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30cc7c7a_bda9_4b40_972b_6c87af01ad23.slice/crio-ffadcf14cdcbc531a6a5720509421fd6e99f70982b04773f586d6b85c573d1f0 WatchSource:0}: Error finding container ffadcf14cdcbc531a6a5720509421fd6e99f70982b04773f586d6b85c573d1f0: Status 404 returned error can't find the container with id ffadcf14cdcbc531a6a5720509421fd6e99f70982b04773f586d6b85c573d1f0 Oct 07 11:33:08 crc kubenswrapper[4700]: I1007 11:33:08.352412 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pv9ct" event={"ID":"30cc7c7a-bda9-4b40-972b-6c87af01ad23","Type":"ContainerStarted","Data":"5fed95a75060dc38e9513ac6b1f95aa9b001fe2ed89964b15a01bed434985cbc"} Oct 07 11:33:08 crc kubenswrapper[4700]: I1007 11:33:08.352919 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pv9ct" event={"ID":"30cc7c7a-bda9-4b40-972b-6c87af01ad23","Type":"ContainerStarted","Data":"ffadcf14cdcbc531a6a5720509421fd6e99f70982b04773f586d6b85c573d1f0"} Oct 07 11:33:09 crc kubenswrapper[4700]: I1007 11:33:09.380236 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pv9ct" event={"ID":"30cc7c7a-bda9-4b40-972b-6c87af01ad23","Type":"ContainerStarted","Data":"4f12b17249b35a2c7e5a7ddbf03e112aa8c01193f9414b8cd48c58d4715fb5c1"} Oct 07 11:33:09 crc kubenswrapper[4700]: I1007 11:33:09.381331 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pv9ct" Oct 07 11:33:09 crc kubenswrapper[4700]: I1007 11:33:09.405582 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pv9ct" podStartSLOduration=4.405562004 podStartE2EDuration="4.405562004s" podCreationTimestamp="2025-10-07 11:33:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:33:09.404347632 +0000 UTC m=+756.200746621" watchObservedRunningTime="2025-10-07 11:33:09.405562004 +0000 UTC m=+756.201960993" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.242752 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.243244 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" containerID="cri-o://39e4ebfa264df42fd9eacf75383dfc334325d3e2a6b41394ee3e60d2671296a5" gracePeriod=30 Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.327762 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.328002 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" podUID="86dce66a-8c06-4f71-8ce9-dec87390310d" containerName="route-controller-manager" containerID="cri-o://924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab" gracePeriod=30 Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.399958 4700 generic.go:334] "Generic (PLEG): container finished" podID="2557333d-abb9-4863-81c9-397307a108f6" containerID="39e4ebfa264df42fd9eacf75383dfc334325d3e2a6b41394ee3e60d2671296a5" exitCode=0 Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.400003 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" 
event={"ID":"2557333d-abb9-4863-81c9-397307a108f6","Type":"ContainerDied","Data":"39e4ebfa264df42fd9eacf75383dfc334325d3e2a6b41394ee3e60d2671296a5"} Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.748857 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.832181 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.890048 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config\") pod \"2557333d-abb9-4863-81c9-397307a108f6\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.890159 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles\") pod \"2557333d-abb9-4863-81c9-397307a108f6\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.890200 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca\") pod \"2557333d-abb9-4863-81c9-397307a108f6\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.890218 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert\") pod \"2557333d-abb9-4863-81c9-397307a108f6\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " Oct 07 11:33:11 crc 
kubenswrapper[4700]: I1007 11:33:11.890246 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whct\" (UniqueName: \"kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct\") pod \"2557333d-abb9-4863-81c9-397307a108f6\" (UID: \"2557333d-abb9-4863-81c9-397307a108f6\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.892131 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2557333d-abb9-4863-81c9-397307a108f6" (UID: "2557333d-abb9-4863-81c9-397307a108f6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.892160 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca" (OuterVolumeSpecName: "client-ca") pod "2557333d-abb9-4863-81c9-397307a108f6" (UID: "2557333d-abb9-4863-81c9-397307a108f6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.892369 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config" (OuterVolumeSpecName: "config") pod "2557333d-abb9-4863-81c9-397307a108f6" (UID: "2557333d-abb9-4863-81c9-397307a108f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.897418 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2557333d-abb9-4863-81c9-397307a108f6" (UID: "2557333d-abb9-4863-81c9-397307a108f6"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.906252 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct" (OuterVolumeSpecName: "kube-api-access-9whct") pod "2557333d-abb9-4863-81c9-397307a108f6" (UID: "2557333d-abb9-4863-81c9-397307a108f6"). InnerVolumeSpecName "kube-api-access-9whct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.991873 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmc5k\" (UniqueName: \"kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k\") pod \"86dce66a-8c06-4f71-8ce9-dec87390310d\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.991974 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca\") pod \"86dce66a-8c06-4f71-8ce9-dec87390310d\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992043 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config\") pod \"86dce66a-8c06-4f71-8ce9-dec87390310d\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992105 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert\") pod \"86dce66a-8c06-4f71-8ce9-dec87390310d\" (UID: \"86dce66a-8c06-4f71-8ce9-dec87390310d\") " Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 
11:33:11.992415 4700 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992430 4700 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992439 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557333d-abb9-4863-81c9-397307a108f6-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992447 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whct\" (UniqueName: \"kubernetes.io/projected/2557333d-abb9-4863-81c9-397307a108f6-kube-api-access-9whct\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992457 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557333d-abb9-4863-81c9-397307a108f6-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.992969 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config" (OuterVolumeSpecName: "config") pod "86dce66a-8c06-4f71-8ce9-dec87390310d" (UID: "86dce66a-8c06-4f71-8ce9-dec87390310d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.993163 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca" (OuterVolumeSpecName: "client-ca") pod "86dce66a-8c06-4f71-8ce9-dec87390310d" (UID: "86dce66a-8c06-4f71-8ce9-dec87390310d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.996182 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k" (OuterVolumeSpecName: "kube-api-access-mmc5k") pod "86dce66a-8c06-4f71-8ce9-dec87390310d" (UID: "86dce66a-8c06-4f71-8ce9-dec87390310d"). InnerVolumeSpecName "kube-api-access-mmc5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:11 crc kubenswrapper[4700]: I1007 11:33:11.996398 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86dce66a-8c06-4f71-8ce9-dec87390310d" (UID: "86dce66a-8c06-4f71-8ce9-dec87390310d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.106687 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86dce66a-8c06-4f71-8ce9-dec87390310d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.106720 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmc5k\" (UniqueName: \"kubernetes.io/projected/86dce66a-8c06-4f71-8ce9-dec87390310d-kube-api-access-mmc5k\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.106729 4700 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.106739 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86dce66a-8c06-4f71-8ce9-dec87390310d-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.407769 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" event={"ID":"2557333d-abb9-4863-81c9-397307a108f6","Type":"ContainerDied","Data":"c241313bd9a927ee4267bbafe5d7419fc930cf2569335ebbdcee6f498c357b4d"} Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.407863 4700 scope.go:117] "RemoveContainer" containerID="39e4ebfa264df42fd9eacf75383dfc334325d3e2a6b41394ee3e60d2671296a5" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.408002 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8l887" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.413407 4700 generic.go:334] "Generic (PLEG): container finished" podID="86dce66a-8c06-4f71-8ce9-dec87390310d" containerID="924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab" exitCode=0 Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.413478 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" event={"ID":"86dce66a-8c06-4f71-8ce9-dec87390310d","Type":"ContainerDied","Data":"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab"} Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.413506 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.413526 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6" event={"ID":"86dce66a-8c06-4f71-8ce9-dec87390310d","Type":"ContainerDied","Data":"89e650332482887468c554060bf414b6bf6fdd8dab49cc7cc7c9e7b499b86aff"} Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.432693 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.436502 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8l887"] Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.462623 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:33:12 crc kubenswrapper[4700]: I1007 11:33:12.465368 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4c7j6"] Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.023013 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79f9c84b64-2pjvc"] Oct 07 11:33:13 crc kubenswrapper[4700]: E1007 11:33:13.024208 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.024223 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: E1007 11:33:13.024233 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dce66a-8c06-4f71-8ce9-dec87390310d" containerName="route-controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.024241 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dce66a-8c06-4f71-8ce9-dec87390310d" containerName="route-controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.024378 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dce66a-8c06-4f71-8ce9-dec87390310d" containerName="route-controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.024396 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="2557333d-abb9-4863-81c9-397307a108f6" containerName="controller-manager" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.024906 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.027166 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.027340 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.027752 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.028052 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.032662 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.033285 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.033409 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5"] Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.034155 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.036211 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.037697 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79f9c84b64-2pjvc"] Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.038428 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.039326 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.039515 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.039645 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.039772 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.039909 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.040721 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5"] Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126018 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8g7p2\" (UniqueName: \"kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126095 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126157 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kvj4\" (UniqueName: \"kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126180 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126201 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: 
\"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126221 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126272 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126298 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.126347 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.143980 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-79f9c84b64-2pjvc"] Oct 07 11:33:13 crc kubenswrapper[4700]: E1007 11:33:13.148522 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-8g7p2 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" podUID="80d50fa4-4426-479c-bfd3-be65107598e0" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.159203 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5"] Oct 07 11:33:13 crc kubenswrapper[4700]: E1007 11:33:13.160019 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-4kvj4 serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" podUID="214bf5f8-88e7-4f07-8019-5e500339579d" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228500 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g7p2\" (UniqueName: \"kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228598 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228641 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kvj4\" (UniqueName: \"kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228671 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228703 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228731 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228784 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: 
\"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228816 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.228846 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.229810 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.230828 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.231453 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.231806 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.232181 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.244794 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.245274 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.249925 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kvj4\" (UniqueName: \"kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4\") pod \"route-controller-manager-7f7d9fcdf7-9gqm5\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.251932 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g7p2\" (UniqueName: \"kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2\") pod \"controller-manager-79f9c84b64-2pjvc\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.421697 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.421786 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.446998 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.460658 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542275 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert\") pod \"214bf5f8-88e7-4f07-8019-5e500339579d\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542401 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles\") pod \"80d50fa4-4426-479c-bfd3-be65107598e0\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542438 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert\") pod \"80d50fa4-4426-479c-bfd3-be65107598e0\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542482 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca\") pod \"214bf5f8-88e7-4f07-8019-5e500339579d\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542560 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config\") pod \"80d50fa4-4426-479c-bfd3-be65107598e0\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542587 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8g7p2\" (UniqueName: \"kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2\") pod \"80d50fa4-4426-479c-bfd3-be65107598e0\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542614 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config\") pod \"214bf5f8-88e7-4f07-8019-5e500339579d\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542639 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca\") pod \"80d50fa4-4426-479c-bfd3-be65107598e0\" (UID: \"80d50fa4-4426-479c-bfd3-be65107598e0\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.542668 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kvj4\" (UniqueName: \"kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4\") pod \"214bf5f8-88e7-4f07-8019-5e500339579d\" (UID: \"214bf5f8-88e7-4f07-8019-5e500339579d\") " Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.543883 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca" (OuterVolumeSpecName: "client-ca") pod "214bf5f8-88e7-4f07-8019-5e500339579d" (UID: "214bf5f8-88e7-4f07-8019-5e500339579d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.544876 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "80d50fa4-4426-479c-bfd3-be65107598e0" (UID: "80d50fa4-4426-479c-bfd3-be65107598e0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.546081 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4" (OuterVolumeSpecName: "kube-api-access-4kvj4") pod "214bf5f8-88e7-4f07-8019-5e500339579d" (UID: "214bf5f8-88e7-4f07-8019-5e500339579d"). InnerVolumeSpecName "kube-api-access-4kvj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.546819 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config" (OuterVolumeSpecName: "config") pod "214bf5f8-88e7-4f07-8019-5e500339579d" (UID: "214bf5f8-88e7-4f07-8019-5e500339579d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.546961 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config" (OuterVolumeSpecName: "config") pod "80d50fa4-4426-479c-bfd3-be65107598e0" (UID: "80d50fa4-4426-479c-bfd3-be65107598e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.547360 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "80d50fa4-4426-479c-bfd3-be65107598e0" (UID: "80d50fa4-4426-479c-bfd3-be65107598e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.566488 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "80d50fa4-4426-479c-bfd3-be65107598e0" (UID: "80d50fa4-4426-479c-bfd3-be65107598e0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.574514 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "214bf5f8-88e7-4f07-8019-5e500339579d" (UID: "214bf5f8-88e7-4f07-8019-5e500339579d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.579626 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2" (OuterVolumeSpecName: "kube-api-access-8g7p2") pod "80d50fa4-4426-479c-bfd3-be65107598e0" (UID: "80d50fa4-4426-479c-bfd3-be65107598e0"). InnerVolumeSpecName "kube-api-access-8g7p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644510 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214bf5f8-88e7-4f07-8019-5e500339579d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644546 4700 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644558 4700 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80d50fa4-4426-479c-bfd3-be65107598e0-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644566 4700 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644576 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644588 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g7p2\" (UniqueName: \"kubernetes.io/projected/80d50fa4-4426-479c-bfd3-be65107598e0-kube-api-access-8g7p2\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644596 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214bf5f8-88e7-4f07-8019-5e500339579d-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644606 4700 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80d50fa4-4426-479c-bfd3-be65107598e0-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.644615 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kvj4\" (UniqueName: \"kubernetes.io/projected/214bf5f8-88e7-4f07-8019-5e500339579d-kube-api-access-4kvj4\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.968117 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2557333d-abb9-4863-81c9-397307a108f6" path="/var/lib/kubelet/pods/2557333d-abb9-4863-81c9-397307a108f6/volumes" Oct 07 11:33:13 crc kubenswrapper[4700]: I1007 11:33:13.968930 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dce66a-8c06-4f71-8ce9-dec87390310d" path="/var/lib/kubelet/pods/86dce66a-8c06-4f71-8ce9-dec87390310d/volumes" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.430379 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.431841 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79f9c84b64-2pjvc" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.493924 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.502718 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.509142 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f7d9fcdf7-9gqm5"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.509256 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.511964 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.512007 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.512339 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.512355 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.515815 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.516074 4700 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.522267 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.537619 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79f9c84b64-2pjvc"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.542291 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79f9c84b64-2pjvc"] Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.670797 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-client-ca\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.671076 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mx9m\" (UniqueName: \"kubernetes.io/projected/900d177d-3cf4-4eed-997c-1e632b47513a-kube-api-access-4mx9m\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.671258 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-config\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " 
pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.671414 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900d177d-3cf4-4eed-997c-1e632b47513a-serving-cert\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.774552 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-client-ca\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.774648 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mx9m\" (UniqueName: \"kubernetes.io/projected/900d177d-3cf4-4eed-997c-1e632b47513a-kube-api-access-4mx9m\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.774700 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-config\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.774731 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/900d177d-3cf4-4eed-997c-1e632b47513a-serving-cert\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.777514 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-client-ca\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.777809 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d177d-3cf4-4eed-997c-1e632b47513a-config\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.781337 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900d177d-3cf4-4eed-997c-1e632b47513a-serving-cert\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc kubenswrapper[4700]: I1007 11:33:14.805213 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mx9m\" (UniqueName: \"kubernetes.io/projected/900d177d-3cf4-4eed-997c-1e632b47513a-kube-api-access-4mx9m\") pod \"route-controller-manager-b4b7cd7d6-q7v87\" (UID: \"900d177d-3cf4-4eed-997c-1e632b47513a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:14 crc 
kubenswrapper[4700]: I1007 11:33:14.837390 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.334476 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.334979 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.444762 4700 scope.go:117] "RemoveContainer" containerID="924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.530929 4700 scope.go:117] "RemoveContainer" containerID="924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab" Oct 07 11:33:15 crc kubenswrapper[4700]: E1007 11:33:15.531825 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab\": container with ID starting with 924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab not found: ID does not exist" containerID="924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.531895 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab"} err="failed to get container status \"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab\": rpc error: code = NotFound desc = could not find container \"924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab\": container with ID starting with 924e6fc90a5277b37ef5bfd0e44d514709528eb378983776fb08b75d08ba3dab not found: ID does not exist" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.831649 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87"] Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.967214 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214bf5f8-88e7-4f07-8019-5e500339579d" path="/var/lib/kubelet/pods/214bf5f8-88e7-4f07-8019-5e500339579d/volumes" Oct 07 11:33:15 crc kubenswrapper[4700]: I1007 11:33:15.967997 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d50fa4-4426-479c-bfd3-be65107598e0" path="/var/lib/kubelet/pods/80d50fa4-4426-479c-bfd3-be65107598e0/volumes" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.287408 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-47w4l" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.447903 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" event={"ID":"900d177d-3cf4-4eed-997c-1e632b47513a","Type":"ContainerStarted","Data":"746098c7576e248ffdfa534d743c0695c32c24539632fe760195ec79d8d5c6e6"} Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.448014 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.448038 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" event={"ID":"900d177d-3cf4-4eed-997c-1e632b47513a","Type":"ContainerStarted","Data":"720b0b4551e0c4ad2281597dbcc17fce868652e1e69bed3a0c06a3c02615d55c"} Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.450221 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" event={"ID":"173c5607-1006-4c8c-afc1-79c8248bbe7a","Type":"ContainerStarted","Data":"f50ff86a01e002cbf68a97b15936d50cee4ca95e010225a59ca3f64255460fb8"} Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.450346 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.452373 4700 generic.go:334] "Generic (PLEG): container finished" podID="d2a66390-1e1f-433a-8637-8f9c03197ab8" containerID="2eae6b686e41a40f5ee82fe172db4b518fedaef3fa08c0ba8c1326e812361a36" exitCode=0 Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.452439 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerDied","Data":"2eae6b686e41a40f5ee82fe172db4b518fedaef3fa08c0ba8c1326e812361a36"} Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.454851 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.496811 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" podStartSLOduration=2.464985552 podStartE2EDuration="11.496784148s" podCreationTimestamp="2025-10-07 11:33:05 +0000 UTC" firstStartedPulling="2025-10-07 11:33:06.617769675 +0000 UTC m=+753.414168684" 
lastFinishedPulling="2025-10-07 11:33:15.649568291 +0000 UTC m=+762.445967280" observedRunningTime="2025-10-07 11:33:16.494697383 +0000 UTC m=+763.291096372" watchObservedRunningTime="2025-10-07 11:33:16.496784148 +0000 UTC m=+763.293183147" Oct 07 11:33:16 crc kubenswrapper[4700]: I1007 11:33:16.497969 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b4b7cd7d6-q7v87" podStartSLOduration=3.497960419 podStartE2EDuration="3.497960419s" podCreationTimestamp="2025-10-07 11:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:33:16.477562952 +0000 UTC m=+763.273961951" watchObservedRunningTime="2025-10-07 11:33:16.497960419 +0000 UTC m=+763.294359418" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.019279 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68cc7449d8-9zq5s"] Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.021028 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.026940 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.031762 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.032800 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.032905 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.032985 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.035090 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.038734 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68cc7449d8-9zq5s"] Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.045782 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.216171 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-serving-cert\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " 
pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.216672 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-config\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.219017 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dn2\" (UniqueName: \"kubernetes.io/projected/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-kube-api-access-j5dn2\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.219276 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-client-ca\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.219363 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-proxy-ca-bundles\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.320986 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-serving-cert\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.321075 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-config\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.321102 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dn2\" (UniqueName: \"kubernetes.io/projected/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-kube-api-access-j5dn2\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.321147 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-client-ca\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.321169 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-proxy-ca-bundles\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.322832 
4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-config\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.322960 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-proxy-ca-bundles\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.323731 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-client-ca\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.331320 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-serving-cert\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.355919 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dn2\" (UniqueName: \"kubernetes.io/projected/11da3391-cee7-4a08-a8d2-fab7cbfd0f63-kube-api-access-j5dn2\") pod \"controller-manager-68cc7449d8-9zq5s\" (UID: \"11da3391-cee7-4a08-a8d2-fab7cbfd0f63\") " pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 
11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.463389 4700 generic.go:334] "Generic (PLEG): container finished" podID="d2a66390-1e1f-433a-8637-8f9c03197ab8" containerID="b89c1af6f902e9a94972f26a7b7065003d3bf68c0804ee7dc18f81eb9cd64309" exitCode=0 Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.464625 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerDied","Data":"b89c1af6f902e9a94972f26a7b7065003d3bf68c0804ee7dc18f81eb9cd64309"} Oct 07 11:33:17 crc kubenswrapper[4700]: I1007 11:33:17.649682 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.080494 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68cc7449d8-9zq5s"] Oct 07 11:33:18 crc kubenswrapper[4700]: W1007 11:33:18.092177 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11da3391_cee7_4a08_a8d2_fab7cbfd0f63.slice/crio-d9283322692460a514737f46ffc1ebe6034fd7acee3c9638b349063c3c420e16 WatchSource:0}: Error finding container d9283322692460a514737f46ffc1ebe6034fd7acee3c9638b349063c3c420e16: Status 404 returned error can't find the container with id d9283322692460a514737f46ffc1ebe6034fd7acee3c9638b349063c3c420e16 Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.326429 4700 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.471328 4700 generic.go:334] "Generic (PLEG): container finished" podID="d2a66390-1e1f-433a-8637-8f9c03197ab8" containerID="4381c29199e2b28c5e88e781eaab89516e9d56a7939d02e0f8fcea63d9e7824b" exitCode=0 Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 
11:33:18.471403 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerDied","Data":"4381c29199e2b28c5e88e781eaab89516e9d56a7939d02e0f8fcea63d9e7824b"} Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.472894 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" event={"ID":"11da3391-cee7-4a08-a8d2-fab7cbfd0f63","Type":"ContainerStarted","Data":"43ef1a1616ac5d0575ba19cd1f77e804e50fb4b73e97b31a0d3a9d342e33e42b"} Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.472931 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" event={"ID":"11da3391-cee7-4a08-a8d2-fab7cbfd0f63","Type":"ContainerStarted","Data":"d9283322692460a514737f46ffc1ebe6034fd7acee3c9638b349063c3c420e16"} Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.473105 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.478581 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" Oct 07 11:33:18 crc kubenswrapper[4700]: I1007 11:33:18.538041 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68cc7449d8-9zq5s" podStartSLOduration=5.5380223619999995 podStartE2EDuration="5.538022362s" podCreationTimestamp="2025-10-07 11:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:33:18.533635776 +0000 UTC m=+765.330034765" watchObservedRunningTime="2025-10-07 11:33:18.538022362 +0000 UTC m=+765.334421351" Oct 07 11:33:19 crc kubenswrapper[4700]: I1007 
11:33:19.504648 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"81f6bb116e7cda2d95348fcff84a0d79707b295dbd8ef7fb84bbfdddbc769a93"} Oct 07 11:33:19 crc kubenswrapper[4700]: I1007 11:33:19.505107 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"1a84d6b4474c0ec09587af39dd4304739e34e14d18921d9178c749d98dfd6610"} Oct 07 11:33:19 crc kubenswrapper[4700]: I1007 11:33:19.505118 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"60276ea61aba6f6531da2d0ead1eb90c7bc900eb47db782048e94ba9d2e065f9"} Oct 07 11:33:19 crc kubenswrapper[4700]: I1007 11:33:19.505130 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"756a2b008e1efc3446738da77ffc71e03458eddfcf02cefb0b8558b1ee86fb8f"} Oct 07 11:33:19 crc kubenswrapper[4700]: I1007 11:33:19.505141 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"d6151712f35b25fe356b046aedea22bf94c1ddb0186c595ddaa1ab3599d86f6c"} Oct 07 11:33:20 crc kubenswrapper[4700]: I1007 11:33:20.515641 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zz4cl" event={"ID":"d2a66390-1e1f-433a-8637-8f9c03197ab8","Type":"ContainerStarted","Data":"2a07e22da7c8d44c98abe5c3069e9b69ea16d3c6b5238ce40f7bcbd0fc1dc089"} Oct 07 11:33:20 crc kubenswrapper[4700]: I1007 11:33:20.516222 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:20 crc kubenswrapper[4700]: I1007 
11:33:20.540476 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zz4cl" podStartSLOduration=6.818938389 podStartE2EDuration="15.540449441s" podCreationTimestamp="2025-10-07 11:33:05 +0000 UTC" firstStartedPulling="2025-10-07 11:33:06.85407483 +0000 UTC m=+753.650473859" lastFinishedPulling="2025-10-07 11:33:15.575585932 +0000 UTC m=+762.371984911" observedRunningTime="2025-10-07 11:33:20.538558392 +0000 UTC m=+767.334957381" watchObservedRunningTime="2025-10-07 11:33:20.540449441 +0000 UTC m=+767.336848430" Oct 07 11:33:21 crc kubenswrapper[4700]: I1007 11:33:21.723989 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:21 crc kubenswrapper[4700]: I1007 11:33:21.769854 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:26 crc kubenswrapper[4700]: I1007 11:33:26.140651 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-59fcf" Oct 07 11:33:27 crc kubenswrapper[4700]: I1007 11:33:27.709166 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pv9ct" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.532408 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.534172 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.547434 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.554959 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.556907 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.636122 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdt9z\" (UniqueName: \"kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z\") pod \"openstack-operator-index-rrqwh\" (UID: \"147f20ac-032c-4a86-8445-8602712382c2\") " pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.736944 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdt9z\" (UniqueName: \"kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z\") pod \"openstack-operator-index-rrqwh\" (UID: \"147f20ac-032c-4a86-8445-8602712382c2\") " pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.760627 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdt9z\" (UniqueName: \"kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z\") pod \"openstack-operator-index-rrqwh\" (UID: \"147f20ac-032c-4a86-8445-8602712382c2\") " pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:30 crc kubenswrapper[4700]: I1007 11:33:30.859700 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:31 crc kubenswrapper[4700]: I1007 11:33:31.289807 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:31 crc kubenswrapper[4700]: I1007 11:33:31.599450 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrqwh" event={"ID":"147f20ac-032c-4a86-8445-8602712382c2","Type":"ContainerStarted","Data":"81598f4f4ab7aba349c936b5b9b7297ccea76072dee89d63ef9a14268161137c"} Oct 07 11:33:33 crc kubenswrapper[4700]: I1007 11:33:33.708728 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.329146 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7lqzz"] Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.340464 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.343440 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lqzz"] Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.350722 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j99t7" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.492750 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbskp\" (UniqueName: \"kubernetes.io/projected/5d24bd3d-929c-478a-9c26-3a94f09dd79a-kube-api-access-bbskp\") pod \"openstack-operator-index-7lqzz\" (UID: \"5d24bd3d-929c-478a-9c26-3a94f09dd79a\") " pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.593939 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbskp\" (UniqueName: \"kubernetes.io/projected/5d24bd3d-929c-478a-9c26-3a94f09dd79a-kube-api-access-bbskp\") pod \"openstack-operator-index-7lqzz\" (UID: \"5d24bd3d-929c-478a-9c26-3a94f09dd79a\") " pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.615824 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbskp\" (UniqueName: \"kubernetes.io/projected/5d24bd3d-929c-478a-9c26-3a94f09dd79a-kube-api-access-bbskp\") pod \"openstack-operator-index-7lqzz\" (UID: \"5d24bd3d-929c-478a-9c26-3a94f09dd79a\") " pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.620810 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrqwh" 
event={"ID":"147f20ac-032c-4a86-8445-8602712382c2","Type":"ContainerStarted","Data":"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f"} Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.620995 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rrqwh" podUID="147f20ac-032c-4a86-8445-8602712382c2" containerName="registry-server" containerID="cri-o://9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f" gracePeriod=2 Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.639602 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rrqwh" podStartSLOduration=1.5913609229999999 podStartE2EDuration="4.639572129s" podCreationTimestamp="2025-10-07 11:33:30 +0000 UTC" firstStartedPulling="2025-10-07 11:33:31.304928502 +0000 UTC m=+778.101327541" lastFinishedPulling="2025-10-07 11:33:34.353139748 +0000 UTC m=+781.149538747" observedRunningTime="2025-10-07 11:33:34.632913244 +0000 UTC m=+781.429312243" watchObservedRunningTime="2025-10-07 11:33:34.639572129 +0000 UTC m=+781.435971128" Oct 07 11:33:34 crc kubenswrapper[4700]: I1007 11:33:34.672692 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.089788 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lqzz"] Oct 07 11:33:35 crc kubenswrapper[4700]: W1007 11:33:35.098372 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d24bd3d_929c_478a_9c26_3a94f09dd79a.slice/crio-51ba5bbc34c9e3bd278cfb598fef972514ea8952d8b72f7cf47b72ae367f6a40 WatchSource:0}: Error finding container 51ba5bbc34c9e3bd278cfb598fef972514ea8952d8b72f7cf47b72ae367f6a40: Status 404 returned error can't find the container with id 51ba5bbc34c9e3bd278cfb598fef972514ea8952d8b72f7cf47b72ae367f6a40 Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.163508 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.302387 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdt9z\" (UniqueName: \"kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z\") pod \"147f20ac-032c-4a86-8445-8602712382c2\" (UID: \"147f20ac-032c-4a86-8445-8602712382c2\") " Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.311586 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z" (OuterVolumeSpecName: "kube-api-access-cdt9z") pod "147f20ac-032c-4a86-8445-8602712382c2" (UID: "147f20ac-032c-4a86-8445-8602712382c2"). InnerVolumeSpecName "kube-api-access-cdt9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.404138 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdt9z\" (UniqueName: \"kubernetes.io/projected/147f20ac-032c-4a86-8445-8602712382c2-kube-api-access-cdt9z\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.638137 4700 generic.go:334] "Generic (PLEG): container finished" podID="147f20ac-032c-4a86-8445-8602712382c2" containerID="9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f" exitCode=0 Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.638248 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrqwh" event={"ID":"147f20ac-032c-4a86-8445-8602712382c2","Type":"ContainerDied","Data":"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f"} Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.638291 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrqwh" event={"ID":"147f20ac-032c-4a86-8445-8602712382c2","Type":"ContainerDied","Data":"81598f4f4ab7aba349c936b5b9b7297ccea76072dee89d63ef9a14268161137c"} Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.638346 4700 scope.go:117] "RemoveContainer" containerID="9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.638507 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rrqwh" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.643960 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lqzz" event={"ID":"5d24bd3d-929c-478a-9c26-3a94f09dd79a","Type":"ContainerStarted","Data":"ecde8f5473edf573b2f06e8260c7b589b1f2682a6e2c5b1d72e6c6abffb646a1"} Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.644059 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lqzz" event={"ID":"5d24bd3d-929c-478a-9c26-3a94f09dd79a","Type":"ContainerStarted","Data":"51ba5bbc34c9e3bd278cfb598fef972514ea8952d8b72f7cf47b72ae367f6a40"} Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.668054 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7lqzz" podStartSLOduration=1.618729372 podStartE2EDuration="1.667997352s" podCreationTimestamp="2025-10-07 11:33:34 +0000 UTC" firstStartedPulling="2025-10-07 11:33:35.102606875 +0000 UTC m=+781.899005854" lastFinishedPulling="2025-10-07 11:33:35.151874795 +0000 UTC m=+781.948273834" observedRunningTime="2025-10-07 11:33:35.660713771 +0000 UTC m=+782.457112800" watchObservedRunningTime="2025-10-07 11:33:35.667997352 +0000 UTC m=+782.464396341" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.672383 4700 scope.go:117] "RemoveContainer" containerID="9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f" Oct 07 11:33:35 crc kubenswrapper[4700]: E1007 11:33:35.673146 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f\": container with ID starting with 9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f not found: ID does not exist" 
containerID="9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.673226 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f"} err="failed to get container status \"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f\": rpc error: code = NotFound desc = could not find container \"9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f\": container with ID starting with 9921a8d546e43e1d8596b27cfec7167a8b65e68b0cff180e013a15381aa2658f not found: ID does not exist" Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.691022 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.695952 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rrqwh"] Oct 07 11:33:35 crc kubenswrapper[4700]: I1007 11:33:35.966360 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147f20ac-032c-4a86-8445-8602712382c2" path="/var/lib/kubelet/pods/147f20ac-032c-4a86-8445-8602712382c2/volumes" Oct 07 11:33:36 crc kubenswrapper[4700]: I1007 11:33:36.728908 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zz4cl" Oct 07 11:33:44 crc kubenswrapper[4700]: I1007 11:33:44.673152 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:44 crc kubenswrapper[4700]: I1007 11:33:44.673968 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:44 crc kubenswrapper[4700]: I1007 11:33:44.714010 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:44 crc kubenswrapper[4700]: I1007 11:33:44.751821 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7lqzz" Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.334406 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.334497 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.334563 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.335527 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.335632 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" 
containerID="cri-o://01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6" gracePeriod=600 Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.731229 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6" exitCode=0 Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.731301 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6"} Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.731855 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee"} Oct 07 11:33:45 crc kubenswrapper[4700]: I1007 11:33:45.731901 4700 scope.go:117] "RemoveContainer" containerID="7b5dd63de890f68091da6e9c7a22abc43dbafc0c1de89538465502461bd7d04c" Oct 07 11:33:51 crc kubenswrapper[4700]: I1007 11:33:51.996999 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml"] Oct 07 11:33:51 crc kubenswrapper[4700]: E1007 11:33:51.998018 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147f20ac-032c-4a86-8445-8602712382c2" containerName="registry-server" Oct 07 11:33:51 crc kubenswrapper[4700]: I1007 11:33:51.998040 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="147f20ac-032c-4a86-8445-8602712382c2" containerName="registry-server" Oct 07 11:33:51 crc kubenswrapper[4700]: I1007 11:33:51.998254 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="147f20ac-032c-4a86-8445-8602712382c2" 
containerName="registry-server" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:51.999745 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.003769 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-v5zll" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.005848 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml"] Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.121237 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.124203 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.133079 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.164565 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.164635 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content\") pod \"certified-operators-2np77\" (UID: 
\"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.164901 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.164956 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqjz\" (UniqueName: \"kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.165167 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnfm\" (UniqueName: \"kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.165269 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266239 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266299 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266404 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266423 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqjz\" (UniqueName: \"kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266455 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnfm\" (UniqueName: \"kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " 
pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266478 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266968 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.266986 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.267236 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.267388 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.288168 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnfm\" (UniqueName: \"kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm\") pod \"673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.289380 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqjz\" (UniqueName: \"kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz\") pod \"certified-operators-2np77\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.324199 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.446991 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.776020 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml"] Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.802226 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" event={"ID":"6227f006-b587-451f-bff5-cf97da256b9f","Type":"ContainerStarted","Data":"7b3b48279b348737c8e49949db1e144ba06a1a275d22f4110fba9d1ee67889e5"} Oct 07 11:33:52 crc kubenswrapper[4700]: I1007 11:33:52.895295 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:33:52 crc kubenswrapper[4700]: W1007 11:33:52.902740 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd187b6_6209_476c_8c8f_633a5c81aa95.slice/crio-4be950ef4952767d69bc827f88219689414fc7d834add7ed846f1d05e90ee17f WatchSource:0}: Error finding container 4be950ef4952767d69bc827f88219689414fc7d834add7ed846f1d05e90ee17f: Status 404 returned error can't find the container with id 4be950ef4952767d69bc827f88219689414fc7d834add7ed846f1d05e90ee17f Oct 07 11:33:53 crc kubenswrapper[4700]: I1007 11:33:53.813418 4700 generic.go:334] "Generic (PLEG): container finished" podID="6227f006-b587-451f-bff5-cf97da256b9f" containerID="18e76fbfdd835af3384ca201f03d6969068b9ece7fa99d7991b8516ab32209fb" exitCode=0 Oct 07 11:33:53 crc kubenswrapper[4700]: I1007 11:33:53.813564 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" event={"ID":"6227f006-b587-451f-bff5-cf97da256b9f","Type":"ContainerDied","Data":"18e76fbfdd835af3384ca201f03d6969068b9ece7fa99d7991b8516ab32209fb"} Oct 07 11:33:53 crc 
kubenswrapper[4700]: I1007 11:33:53.820334 4700 generic.go:334] "Generic (PLEG): container finished" podID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerID="1228f24fbd39d5a19d35bece2751e2c38d316fce1e3cbd40c2e952dc10fa91dc" exitCode=0 Oct 07 11:33:53 crc kubenswrapper[4700]: I1007 11:33:53.820415 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerDied","Data":"1228f24fbd39d5a19d35bece2751e2c38d316fce1e3cbd40c2e952dc10fa91dc"} Oct 07 11:33:53 crc kubenswrapper[4700]: I1007 11:33:53.820511 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerStarted","Data":"4be950ef4952767d69bc827f88219689414fc7d834add7ed846f1d05e90ee17f"} Oct 07 11:33:54 crc kubenswrapper[4700]: I1007 11:33:54.831748 4700 generic.go:334] "Generic (PLEG): container finished" podID="6227f006-b587-451f-bff5-cf97da256b9f" containerID="2515ff50429e1190807c82a15d7acc438b21eeb6befae33310a3cb2ac2419e10" exitCode=0 Oct 07 11:33:54 crc kubenswrapper[4700]: I1007 11:33:54.831851 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" event={"ID":"6227f006-b587-451f-bff5-cf97da256b9f","Type":"ContainerDied","Data":"2515ff50429e1190807c82a15d7acc438b21eeb6befae33310a3cb2ac2419e10"} Oct 07 11:33:55 crc kubenswrapper[4700]: I1007 11:33:55.846175 4700 generic.go:334] "Generic (PLEG): container finished" podID="6227f006-b587-451f-bff5-cf97da256b9f" containerID="63589fa2c7510edefac5c534325532b0829b21a54ce0fe4c2b6843d55bf8be5f" exitCode=0 Oct 07 11:33:55 crc kubenswrapper[4700]: I1007 11:33:55.846273 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" 
event={"ID":"6227f006-b587-451f-bff5-cf97da256b9f","Type":"ContainerDied","Data":"63589fa2c7510edefac5c534325532b0829b21a54ce0fe4c2b6843d55bf8be5f"} Oct 07 11:33:55 crc kubenswrapper[4700]: I1007 11:33:55.850247 4700 generic.go:334] "Generic (PLEG): container finished" podID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerID="cf2724986d93e5f396ed3f076398c88351bb0f17208e8f0d3b8fe910594726aa" exitCode=0 Oct 07 11:33:55 crc kubenswrapper[4700]: I1007 11:33:55.850282 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerDied","Data":"cf2724986d93e5f396ed3f076398c88351bb0f17208e8f0d3b8fe910594726aa"} Oct 07 11:33:56 crc kubenswrapper[4700]: I1007 11:33:56.859821 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerStarted","Data":"91675d4928d7b55f504029c211eb84551645b176ee6fd01b7aa5bc185887a73f"} Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.125326 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.153526 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2np77" podStartSLOduration=2.588592803 podStartE2EDuration="5.153503251s" podCreationTimestamp="2025-10-07 11:33:52 +0000 UTC" firstStartedPulling="2025-10-07 11:33:53.822980973 +0000 UTC m=+800.619380002" lastFinishedPulling="2025-10-07 11:33:56.387891461 +0000 UTC m=+803.184290450" observedRunningTime="2025-10-07 11:33:56.882000141 +0000 UTC m=+803.678399130" watchObservedRunningTime="2025-10-07 11:33:57.153503251 +0000 UTC m=+803.949902240" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.244127 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnfm\" (UniqueName: \"kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm\") pod \"6227f006-b587-451f-bff5-cf97da256b9f\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.244191 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util\") pod \"6227f006-b587-451f-bff5-cf97da256b9f\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.244232 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle\") pod \"6227f006-b587-451f-bff5-cf97da256b9f\" (UID: \"6227f006-b587-451f-bff5-cf97da256b9f\") " Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.245124 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle" 
(OuterVolumeSpecName: "bundle") pod "6227f006-b587-451f-bff5-cf97da256b9f" (UID: "6227f006-b587-451f-bff5-cf97da256b9f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.252364 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm" (OuterVolumeSpecName: "kube-api-access-pwnfm") pod "6227f006-b587-451f-bff5-cf97da256b9f" (UID: "6227f006-b587-451f-bff5-cf97da256b9f"). InnerVolumeSpecName "kube-api-access-pwnfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.261184 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util" (OuterVolumeSpecName: "util") pod "6227f006-b587-451f-bff5-cf97da256b9f" (UID: "6227f006-b587-451f-bff5-cf97da256b9f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.346709 4700 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-util\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.346769 4700 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6227f006-b587-451f-bff5-cf97da256b9f-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.346778 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnfm\" (UniqueName: \"kubernetes.io/projected/6227f006-b587-451f-bff5-cf97da256b9f-kube-api-access-pwnfm\") on node \"crc\" DevicePath \"\"" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.872013 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" event={"ID":"6227f006-b587-451f-bff5-cf97da256b9f","Type":"ContainerDied","Data":"7b3b48279b348737c8e49949db1e144ba06a1a275d22f4110fba9d1ee67889e5"} Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.872075 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml" Oct 07 11:33:57 crc kubenswrapper[4700]: I1007 11:33:57.872102 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b3b48279b348737c8e49949db1e144ba06a1a275d22f4110fba9d1ee67889e5" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.035624 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r"] Oct 07 11:34:02 crc kubenswrapper[4700]: E1007 11:34:02.036327 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="util" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.036347 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="util" Oct 07 11:34:02 crc kubenswrapper[4700]: E1007 11:34:02.036368 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="extract" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.036377 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="extract" Oct 07 11:34:02 crc kubenswrapper[4700]: E1007 11:34:02.036391 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="pull" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.036402 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="pull" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.036546 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6227f006-b587-451f-bff5-cf97da256b9f" containerName="extract" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.037374 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.041121 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-k78w2" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.072239 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r"] Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.224334 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8j7\" (UniqueName: \"kubernetes.io/projected/6eb7ce42-9c6b-49c9-b46e-333112be077d-kube-api-access-vq8j7\") pod \"openstack-operator-controller-operator-6489b698cc-vp52r\" (UID: \"6eb7ce42-9c6b-49c9-b46e-333112be077d\") " pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.325704 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8j7\" (UniqueName: \"kubernetes.io/projected/6eb7ce42-9c6b-49c9-b46e-333112be077d-kube-api-access-vq8j7\") pod \"openstack-operator-controller-operator-6489b698cc-vp52r\" (UID: \"6eb7ce42-9c6b-49c9-b46e-333112be077d\") " pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.357934 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8j7\" (UniqueName: \"kubernetes.io/projected/6eb7ce42-9c6b-49c9-b46e-333112be077d-kube-api-access-vq8j7\") pod \"openstack-operator-controller-operator-6489b698cc-vp52r\" (UID: \"6eb7ce42-9c6b-49c9-b46e-333112be077d\") " pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.448659 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.448770 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.498076 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.655402 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:02 crc kubenswrapper[4700]: I1007 11:34:02.963996 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:03 crc kubenswrapper[4700]: I1007 11:34:03.182918 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r"] Oct 07 11:34:03 crc kubenswrapper[4700]: I1007 11:34:03.924477 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" event={"ID":"6eb7ce42-9c6b-49c9-b46e-333112be077d","Type":"ContainerStarted","Data":"7651c3f3535f8db88a6fa18b5d4a05fae9c9c175aeac2ae13121a297fe199425"} Oct 07 11:34:04 crc kubenswrapper[4700]: I1007 11:34:04.498938 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:34:05 crc kubenswrapper[4700]: I1007 11:34:05.938714 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2np77" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="registry-server" containerID="cri-o://91675d4928d7b55f504029c211eb84551645b176ee6fd01b7aa5bc185887a73f" 
gracePeriod=2 Oct 07 11:34:06 crc kubenswrapper[4700]: I1007 11:34:06.950130 4700 generic.go:334] "Generic (PLEG): container finished" podID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerID="91675d4928d7b55f504029c211eb84551645b176ee6fd01b7aa5bc185887a73f" exitCode=0 Oct 07 11:34:06 crc kubenswrapper[4700]: I1007 11:34:06.950245 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerDied","Data":"91675d4928d7b55f504029c211eb84551645b176ee6fd01b7aa5bc185887a73f"} Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.223565 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.331937 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htqjz\" (UniqueName: \"kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz\") pod \"9cd187b6-6209-476c-8c8f-633a5c81aa95\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.332065 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities\") pod \"9cd187b6-6209-476c-8c8f-633a5c81aa95\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.332103 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content\") pod \"9cd187b6-6209-476c-8c8f-633a5c81aa95\" (UID: \"9cd187b6-6209-476c-8c8f-633a5c81aa95\") " Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.333483 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities" (OuterVolumeSpecName: "utilities") pod "9cd187b6-6209-476c-8c8f-633a5c81aa95" (UID: "9cd187b6-6209-476c-8c8f-633a5c81aa95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.352684 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz" (OuterVolumeSpecName: "kube-api-access-htqjz") pod "9cd187b6-6209-476c-8c8f-633a5c81aa95" (UID: "9cd187b6-6209-476c-8c8f-633a5c81aa95"). InnerVolumeSpecName "kube-api-access-htqjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.393441 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cd187b6-6209-476c-8c8f-633a5c81aa95" (UID: "9cd187b6-6209-476c-8c8f-633a5c81aa95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.434348 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqjz\" (UniqueName: \"kubernetes.io/projected/9cd187b6-6209-476c-8c8f-633a5c81aa95-kube-api-access-htqjz\") on node \"crc\" DevicePath \"\"" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.434448 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.434464 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cd187b6-6209-476c-8c8f-633a5c81aa95-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.965815 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2np77" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.966871 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2np77" event={"ID":"9cd187b6-6209-476c-8c8f-633a5c81aa95","Type":"ContainerDied","Data":"4be950ef4952767d69bc827f88219689414fc7d834add7ed846f1d05e90ee17f"} Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.966942 4700 scope.go:117] "RemoveContainer" containerID="91675d4928d7b55f504029c211eb84551645b176ee6fd01b7aa5bc185887a73f" Oct 07 11:34:07 crc kubenswrapper[4700]: I1007 11:34:07.999356 4700 scope.go:117] "RemoveContainer" containerID="cf2724986d93e5f396ed3f076398c88351bb0f17208e8f0d3b8fe910594726aa" Oct 07 11:34:08 crc kubenswrapper[4700]: I1007 11:34:08.007839 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:34:08 crc kubenswrapper[4700]: I1007 11:34:08.020116 4700 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2np77"] Oct 07 11:34:08 crc kubenswrapper[4700]: I1007 11:34:08.043710 4700 scope.go:117] "RemoveContainer" containerID="1228f24fbd39d5a19d35bece2751e2c38d316fce1e3cbd40c2e952dc10fa91dc" Oct 07 11:34:08 crc kubenswrapper[4700]: I1007 11:34:08.982059 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" event={"ID":"6eb7ce42-9c6b-49c9-b46e-333112be077d","Type":"ContainerStarted","Data":"07a20adaebb1038bbce0563326a7c13e8b2bacbcb8c972d590c125022282eb6b"} Oct 07 11:34:09 crc kubenswrapper[4700]: I1007 11:34:09.969283 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" path="/var/lib/kubelet/pods/9cd187b6-6209-476c-8c8f-633a5c81aa95/volumes" Oct 07 11:34:11 crc kubenswrapper[4700]: I1007 11:34:11.000610 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" event={"ID":"6eb7ce42-9c6b-49c9-b46e-333112be077d","Type":"ContainerStarted","Data":"07b7f0d29c63da983a13e486047d668dd7b80fb0b8ea3fdf61da5cca68be8c1f"} Oct 07 11:34:11 crc kubenswrapper[4700]: I1007 11:34:11.000835 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:11 crc kubenswrapper[4700]: I1007 11:34:11.047210 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" podStartSLOduration=1.7997305209999999 podStartE2EDuration="9.047186537s" podCreationTimestamp="2025-10-07 11:34:02 +0000 UTC" firstStartedPulling="2025-10-07 11:34:03.195355903 +0000 UTC m=+809.991754922" lastFinishedPulling="2025-10-07 11:34:10.442811959 +0000 UTC m=+817.239210938" observedRunningTime="2025-10-07 
11:34:11.046952661 +0000 UTC m=+817.843351660" watchObservedRunningTime="2025-10-07 11:34:11.047186537 +0000 UTC m=+817.843585526" Oct 07 11:34:12 crc kubenswrapper[4700]: I1007 11:34:12.010428 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6489b698cc-vp52r" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.705989 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c"] Oct 07 11:34:28 crc kubenswrapper[4700]: E1007 11:34:28.707199 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="extract-utilities" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.707220 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="extract-utilities" Oct 07 11:34:28 crc kubenswrapper[4700]: E1007 11:34:28.707247 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="extract-content" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.707255 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="extract-content" Oct 07 11:34:28 crc kubenswrapper[4700]: E1007 11:34:28.707273 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="registry-server" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.707280 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="registry-server" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.707485 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd187b6-6209-476c-8c8f-633a5c81aa95" containerName="registry-server" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 
11:34:28.708395 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.711406 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wd2gh" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.723273 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.724295 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.733426 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fwb45" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.741857 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.752538 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.753873 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.756871 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.757017 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-btkht" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.783207 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.793759 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brsvc\" (UniqueName: \"kubernetes.io/projected/6b424cb2-c37e-4db9-86f4-75132c345127-kube-api-access-brsvc\") pod \"designate-operator-controller-manager-75dfd9b554-9xkbh\" (UID: \"6b424cb2-c37e-4db9-86f4-75132c345127\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.793826 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8w9l\" (UniqueName: \"kubernetes.io/projected/b5298439-5a42-4bca-aa5b-c3fb26b2e5e3-kube-api-access-g8w9l\") pod \"cinder-operator-controller-manager-7d4d4f8d-mg8jv\" (UID: \"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.793896 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbh28\" (UniqueName: \"kubernetes.io/projected/b69da5cc-fa66-4adb-b136-1efe25092b40-kube-api-access-fbh28\") pod 
\"barbican-operator-controller-manager-58c4cd55f4-vvp7c\" (UID: \"b69da5cc-fa66-4adb-b136-1efe25092b40\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.840378 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.841576 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.847984 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vx4ww" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.850938 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.852130 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.854825 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mnp8r" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.871392 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.892229 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.895005 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbh28\" (UniqueName: \"kubernetes.io/projected/b69da5cc-fa66-4adb-b136-1efe25092b40-kube-api-access-fbh28\") pod \"barbican-operator-controller-manager-58c4cd55f4-vvp7c\" (UID: \"b69da5cc-fa66-4adb-b136-1efe25092b40\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.895089 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brsvc\" (UniqueName: \"kubernetes.io/projected/6b424cb2-c37e-4db9-86f4-75132c345127-kube-api-access-brsvc\") pod \"designate-operator-controller-manager-75dfd9b554-9xkbh\" (UID: \"6b424cb2-c37e-4db9-86f4-75132c345127\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.895120 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8w9l\" (UniqueName: \"kubernetes.io/projected/b5298439-5a42-4bca-aa5b-c3fb26b2e5e3-kube-api-access-g8w9l\") pod \"cinder-operator-controller-manager-7d4d4f8d-mg8jv\" (UID: 
\"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.898950 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.899935 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.910913 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-njc97" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.921241 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.933335 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.934411 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.941727 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.943600 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8w9l\" (UniqueName: \"kubernetes.io/projected/b5298439-5a42-4bca-aa5b-c3fb26b2e5e3-kube-api-access-g8w9l\") pod \"cinder-operator-controller-manager-7d4d4f8d-mg8jv\" (UID: \"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.947184 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.948685 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-92ts6" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.966108 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r"] Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.967431 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbh28\" (UniqueName: \"kubernetes.io/projected/b69da5cc-fa66-4adb-b136-1efe25092b40-kube-api-access-fbh28\") pod \"barbican-operator-controller-manager-58c4cd55f4-vvp7c\" (UID: \"b69da5cc-fa66-4adb-b136-1efe25092b40\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.971425 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.979769 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qd5t5" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.979905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brsvc\" (UniqueName: \"kubernetes.io/projected/6b424cb2-c37e-4db9-86f4-75132c345127-kube-api-access-brsvc\") pod \"designate-operator-controller-manager-75dfd9b554-9xkbh\" (UID: \"6b424cb2-c37e-4db9-86f4-75132c345127\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:28 crc kubenswrapper[4700]: I1007 11:34:28.998129 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.001198 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx829\" (UniqueName: \"kubernetes.io/projected/0ef02b09-6290-414e-b0b3-f9d52138d53d-kube-api-access-tx829\") pod \"glance-operator-controller-manager-5dc44df7d5-8dhrp\" (UID: \"0ef02b09-6290-414e-b0b3-f9d52138d53d\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.001264 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qbf\" (UniqueName: \"kubernetes.io/projected/6992fc43-2f9e-414d-8c14-f08185ed395a-kube-api-access-z9qbf\") pod \"heat-operator-controller-manager-54b4974c45-s2qwp\" (UID: \"6992fc43-2f9e-414d-8c14-f08185ed395a\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.021593 4700 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.024187 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.036320 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2rtxr" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.040802 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.057542 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.075457 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.076592 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.078427 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.082741 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.084449 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-s9vhg" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106406 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgmkd\" (UniqueName: \"kubernetes.io/projected/42762f72-5039-4394-a311-299b57c3485a-kube-api-access-cgmkd\") pod \"ironic-operator-controller-manager-649675d675-dfs4r\" (UID: \"42762f72-5039-4394-a311-299b57c3485a\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106452 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznpt\" (UniqueName: \"kubernetes.io/projected/1f959f97-0c30-4e7c-a006-9517950bc1c1-kube-api-access-tznpt\") pod \"manila-operator-controller-manager-65d89cfd9f-n4w8d\" (UID: \"1f959f97-0c30-4e7c-a006-9517950bc1c1\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106492 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6n8\" (UniqueName: \"kubernetes.io/projected/6f56ee78-9e72-4fbb-abff-985e142a17cb-kube-api-access-9z6n8\") pod \"horizon-operator-controller-manager-76d5b87f47-69f5t\" (UID: \"6f56ee78-9e72-4fbb-abff-985e142a17cb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106506 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glkn9\" (UniqueName: 
\"kubernetes.io/projected/e260e7ed-c267-40b2-861a-9a77325e027a-kube-api-access-glkn9\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-dnn6x\" (UID: \"e260e7ed-c267-40b2-861a-9a77325e027a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106549 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx829\" (UniqueName: \"kubernetes.io/projected/0ef02b09-6290-414e-b0b3-f9d52138d53d-kube-api-access-tx829\") pod \"glance-operator-controller-manager-5dc44df7d5-8dhrp\" (UID: \"0ef02b09-6290-414e-b0b3-f9d52138d53d\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106572 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qbf\" (UniqueName: \"kubernetes.io/projected/6992fc43-2f9e-414d-8c14-f08185ed395a-kube-api-access-z9qbf\") pod \"heat-operator-controller-manager-54b4974c45-s2qwp\" (UID: \"6992fc43-2f9e-414d-8c14-f08185ed395a\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106592 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxwn8\" (UniqueName: \"kubernetes.io/projected/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-kube-api-access-rxwn8\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.106632 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: 
\"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.118490 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.192569 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qbf\" (UniqueName: \"kubernetes.io/projected/6992fc43-2f9e-414d-8c14-f08185ed395a-kube-api-access-z9qbf\") pod \"heat-operator-controller-manager-54b4974c45-s2qwp\" (UID: \"6992fc43-2f9e-414d-8c14-f08185ed395a\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222455 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxwn8\" (UniqueName: \"kubernetes.io/projected/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-kube-api-access-rxwn8\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222539 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222570 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgmkd\" (UniqueName: \"kubernetes.io/projected/42762f72-5039-4394-a311-299b57c3485a-kube-api-access-cgmkd\") pod \"ironic-operator-controller-manager-649675d675-dfs4r\" (UID: 
\"42762f72-5039-4394-a311-299b57c3485a\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222595 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznpt\" (UniqueName: \"kubernetes.io/projected/1f959f97-0c30-4e7c-a006-9517950bc1c1-kube-api-access-tznpt\") pod \"manila-operator-controller-manager-65d89cfd9f-n4w8d\" (UID: \"1f959f97-0c30-4e7c-a006-9517950bc1c1\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222636 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6n8\" (UniqueName: \"kubernetes.io/projected/6f56ee78-9e72-4fbb-abff-985e142a17cb-kube-api-access-9z6n8\") pod \"horizon-operator-controller-manager-76d5b87f47-69f5t\" (UID: \"6f56ee78-9e72-4fbb-abff-985e142a17cb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.222652 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glkn9\" (UniqueName: \"kubernetes.io/projected/e260e7ed-c267-40b2-861a-9a77325e027a-kube-api-access-glkn9\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-dnn6x\" (UID: \"e260e7ed-c267-40b2-861a-9a77325e027a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:29 crc kubenswrapper[4700]: E1007 11:34:29.227842 4700 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 11:34:29 crc kubenswrapper[4700]: E1007 11:34:29.227916 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert podName:0df5f995-5a5e-4c40-a498-9dd5ffd4381c nodeName:}" failed. 
No retries permitted until 2025-10-07 11:34:29.727898042 +0000 UTC m=+836.524297031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert") pod "infra-operator-controller-manager-658588b8c9-5nnwj" (UID: "0df5f995-5a5e-4c40-a498-9dd5ffd4381c") : secret "infra-operator-webhook-server-cert" not found Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.238802 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx829\" (UniqueName: \"kubernetes.io/projected/0ef02b09-6290-414e-b0b3-f9d52138d53d-kube-api-access-tx829\") pod \"glance-operator-controller-manager-5dc44df7d5-8dhrp\" (UID: \"0ef02b09-6290-414e-b0b3-f9d52138d53d\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.242772 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.244157 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.249416 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.250232 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.261977 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.263133 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.264884 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ggllq" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.265157 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rswsw" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.276683 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fk8tl" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.280128 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.285543 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.301816 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4tvr2" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.302034 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.327607 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.339226 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.341377 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgmkd\" (UniqueName: \"kubernetes.io/projected/42762f72-5039-4394-a311-299b57c3485a-kube-api-access-cgmkd\") pod \"ironic-operator-controller-manager-649675d675-dfs4r\" (UID: \"42762f72-5039-4394-a311-299b57c3485a\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.347158 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxwn8\" (UniqueName: \"kubernetes.io/projected/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-kube-api-access-rxwn8\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.347238 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznpt\" (UniqueName: 
\"kubernetes.io/projected/1f959f97-0c30-4e7c-a006-9517950bc1c1-kube-api-access-tznpt\") pod \"manila-operator-controller-manager-65d89cfd9f-n4w8d\" (UID: \"1f959f97-0c30-4e7c-a006-9517950bc1c1\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.352301 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glkn9\" (UniqueName: \"kubernetes.io/projected/e260e7ed-c267-40b2-861a-9a77325e027a-kube-api-access-glkn9\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-dnn6x\" (UID: \"e260e7ed-c267-40b2-861a-9a77325e027a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.352378 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.358145 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6n8\" (UniqueName: \"kubernetes.io/projected/6f56ee78-9e72-4fbb-abff-985e142a17cb-kube-api-access-9z6n8\") pod \"horizon-operator-controller-manager-76d5b87f47-69f5t\" (UID: \"6f56ee78-9e72-4fbb-abff-985e142a17cb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.423688 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.425723 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.433206 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6txtb\" (UniqueName: \"kubernetes.io/projected/4e24a2f2-716e-4273-86a7-7ad450736748-kube-api-access-6txtb\") pod \"octavia-operator-controller-manager-7468f855d8-x8twn\" (UID: \"4e24a2f2-716e-4273-86a7-7ad450736748\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.433321 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjl2j\" (UniqueName: \"kubernetes.io/projected/1986b44c-0b98-4c70-a2a3-78f86e586d87-kube-api-access-kjl2j\") pod \"neutron-operator-controller-manager-8d984cc4d-jrw6r\" (UID: \"1986b44c-0b98-4c70-a2a3-78f86e586d87\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.434614 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjj97\" (UniqueName: \"kubernetes.io/projected/bc62ffd3-f1d8-46c2-8777-c6ad960d68a8-kube-api-access-xjj97\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5szjk\" (UID: \"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.434774 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpc5b\" (UniqueName: \"kubernetes.io/projected/7a64a375-b3e0-47ac-b715-cea59989a781-kube-api-access-kpc5b\") pod \"nova-operator-controller-manager-7c7fc454ff-xxzx6\" (UID: \"7a64a375-b3e0-47ac-b715-cea59989a781\") " 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.438684 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-lvzjp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.438912 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.440182 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.442881 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.449684 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-g8mlb" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.457577 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.470657 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.475760 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.482410 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.483459 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.484873 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.495270 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gs8dz" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.517394 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536507 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtfq\" (UniqueName: \"kubernetes.io/projected/d50469d1-cf53-4396-9d9c-f03db7eb43f3-kube-api-access-qrtfq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-jv8c7\" (UID: \"d50469d1-cf53-4396-9d9c-f03db7eb43f3\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536586 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6txtb\" (UniqueName: \"kubernetes.io/projected/4e24a2f2-716e-4273-86a7-7ad450736748-kube-api-access-6txtb\") pod \"octavia-operator-controller-manager-7468f855d8-x8twn\" (UID: \"4e24a2f2-716e-4273-86a7-7ad450736748\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536656 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kjl2j\" (UniqueName: \"kubernetes.io/projected/1986b44c-0b98-4c70-a2a3-78f86e586d87-kube-api-access-kjl2j\") pod \"neutron-operator-controller-manager-8d984cc4d-jrw6r\" (UID: \"1986b44c-0b98-4c70-a2a3-78f86e586d87\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536703 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536729 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9j2h\" (UniqueName: \"kubernetes.io/projected/e4140e5b-60c0-42c1-9440-0070b773f8c6-kube-api-access-q9j2h\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536752 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjj97\" (UniqueName: \"kubernetes.io/projected/bc62ffd3-f1d8-46c2-8777-c6ad960d68a8-kube-api-access-xjj97\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5szjk\" (UID: \"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.536786 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpc5b\" (UniqueName: 
\"kubernetes.io/projected/7a64a375-b3e0-47ac-b715-cea59989a781-kube-api-access-kpc5b\") pod \"nova-operator-controller-manager-7c7fc454ff-xxzx6\" (UID: \"7a64a375-b3e0-47ac-b715-cea59989a781\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.542446 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.543110 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.544155 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.550218 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.551844 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.568532 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gdgt2" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.591524 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.599202 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.621030 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.621506 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjl2j\" (UniqueName: \"kubernetes.io/projected/1986b44c-0b98-4c70-a2a3-78f86e586d87-kube-api-access-kjl2j\") pod \"neutron-operator-controller-manager-8d984cc4d-jrw6r\" (UID: \"1986b44c-0b98-4c70-a2a3-78f86e586d87\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.622260 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpc5b\" (UniqueName: \"kubernetes.io/projected/7a64a375-b3e0-47ac-b715-cea59989a781-kube-api-access-kpc5b\") pod \"nova-operator-controller-manager-7c7fc454ff-xxzx6\" (UID: \"7a64a375-b3e0-47ac-b715-cea59989a781\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.622770 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.631438 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4r4vv" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.632515 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjj97\" (UniqueName: \"kubernetes.io/projected/bc62ffd3-f1d8-46c2-8777-c6ad960d68a8-kube-api-access-xjj97\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5szjk\" (UID: \"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.635099 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6txtb\" (UniqueName: \"kubernetes.io/projected/4e24a2f2-716e-4273-86a7-7ad450736748-kube-api-access-6txtb\") pod \"octavia-operator-controller-manager-7468f855d8-x8twn\" (UID: \"4e24a2f2-716e-4273-86a7-7ad450736748\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.638217 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.638437 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9j2h\" (UniqueName: \"kubernetes.io/projected/e4140e5b-60c0-42c1-9440-0070b773f8c6-kube-api-access-q9j2h\") pod 
\"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.638627 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5f7k\" (UniqueName: \"kubernetes.io/projected/0f2a771c-5adf-4c17-94ac-d7b988e3ea86-kube-api-access-b5f7k\") pod \"swift-operator-controller-manager-6859f9b676-nq2r8\" (UID: \"0f2a771c-5adf-4c17-94ac-d7b988e3ea86\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.638757 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmgv\" (UniqueName: \"kubernetes.io/projected/81628b8b-eb67-4514-bbb4-44341c3962ce-kube-api-access-dsmgv\") pod \"placement-operator-controller-manager-54689d9f88-xvtgc\" (UID: \"81628b8b-eb67-4514-bbb4-44341c3962ce\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.638927 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtfq\" (UniqueName: \"kubernetes.io/projected/d50469d1-cf53-4396-9d9c-f03db7eb43f3-kube-api-access-qrtfq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-jv8c7\" (UID: \"d50469d1-cf53-4396-9d9c-f03db7eb43f3\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:34:29 crc kubenswrapper[4700]: E1007 11:34:29.639932 4700 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 11:34:29 crc kubenswrapper[4700]: E1007 11:34:29.640113 4700 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert podName:e4140e5b-60c0-42c1-9440-0070b773f8c6 nodeName:}" failed. No retries permitted until 2025-10-07 11:34:30.140070386 +0000 UTC m=+836.936469375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" (UID: "e4140e5b-60c0-42c1-9440-0070b773f8c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.654892 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.656449 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.662326 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.664887 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.669645 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-px9n8" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.669857 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nvbfs" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.674041 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.685738 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.698248 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9j2h\" (UniqueName: \"kubernetes.io/projected/e4140e5b-60c0-42c1-9440-0070b773f8c6-kube-api-access-q9j2h\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.701608 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.703887 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtfq\" (UniqueName: \"kubernetes.io/projected/d50469d1-cf53-4396-9d9c-f03db7eb43f3-kube-api-access-qrtfq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-jv8c7\" (UID: \"d50469d1-cf53-4396-9d9c-f03db7eb43f3\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.704838 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.730115 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741696 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxqf\" (UniqueName: \"kubernetes.io/projected/ae80f142-9ac8-4614-a0c5-b74dfe98b0c8-kube-api-access-tvxqf\") pod \"test-operator-controller-manager-5cd5cb47d7-9zh29\" (UID: \"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741743 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtr9d\" (UniqueName: \"kubernetes.io/projected/c147a183-5f67-45a0-a971-87e75df2a66e-kube-api-access-gtr9d\") pod \"watcher-operator-controller-manager-6cbc6dd547-khdlk\" (UID: \"c147a183-5f67-45a0-a971-87e75df2a66e\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741774 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd47c\" (UniqueName: \"kubernetes.io/projected/b5911608-e23e-46e8-9637-488593110278-kube-api-access-bd47c\") pod \"telemetry-operator-controller-manager-bf98bb7b6-ghwv9\" (UID: \"b5911608-e23e-46e8-9637-488593110278\") " pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741820 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741883 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5f7k\" (UniqueName: \"kubernetes.io/projected/0f2a771c-5adf-4c17-94ac-d7b988e3ea86-kube-api-access-b5f7k\") pod \"swift-operator-controller-manager-6859f9b676-nq2r8\" (UID: \"0f2a771c-5adf-4c17-94ac-d7b988e3ea86\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.741914 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmgv\" (UniqueName: \"kubernetes.io/projected/81628b8b-eb67-4514-bbb4-44341c3962ce-kube-api-access-dsmgv\") pod \"placement-operator-controller-manager-54689d9f88-xvtgc\" (UID: \"81628b8b-eb67-4514-bbb4-44341c3962ce\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.746951 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.803504 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0df5f995-5a5e-4c40-a498-9dd5ffd4381c-cert\") pod \"infra-operator-controller-manager-658588b8c9-5nnwj\" (UID: \"0df5f995-5a5e-4c40-a498-9dd5ffd4381c\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.809870 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmgv\" (UniqueName: \"kubernetes.io/projected/81628b8b-eb67-4514-bbb4-44341c3962ce-kube-api-access-dsmgv\") pod \"placement-operator-controller-manager-54689d9f88-xvtgc\" (UID: \"81628b8b-eb67-4514-bbb4-44341c3962ce\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.827950 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.829351 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5f7k\" (UniqueName: \"kubernetes.io/projected/0f2a771c-5adf-4c17-94ac-d7b988e3ea86-kube-api-access-b5f7k\") pod \"swift-operator-controller-manager-6859f9b676-nq2r8\" (UID: \"0f2a771c-5adf-4c17-94ac-d7b988e3ea86\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.844044 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxqf\" (UniqueName: \"kubernetes.io/projected/ae80f142-9ac8-4614-a0c5-b74dfe98b0c8-kube-api-access-tvxqf\") pod \"test-operator-controller-manager-5cd5cb47d7-9zh29\" (UID: \"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.844103 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtr9d\" (UniqueName: \"kubernetes.io/projected/c147a183-5f67-45a0-a971-87e75df2a66e-kube-api-access-gtr9d\") pod \"watcher-operator-controller-manager-6cbc6dd547-khdlk\" (UID: \"c147a183-5f67-45a0-a971-87e75df2a66e\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.844131 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd47c\" (UniqueName: \"kubernetes.io/projected/b5911608-e23e-46e8-9637-488593110278-kube-api-access-bd47c\") pod \"telemetry-operator-controller-manager-bf98bb7b6-ghwv9\" (UID: \"b5911608-e23e-46e8-9637-488593110278\") " pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.850968 4700 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.878953 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxqf\" (UniqueName: \"kubernetes.io/projected/ae80f142-9ac8-4614-a0c5-b74dfe98b0c8-kube-api-access-tvxqf\") pod \"test-operator-controller-manager-5cd5cb47d7-9zh29\" (UID: \"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.898488 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd47c\" (UniqueName: \"kubernetes.io/projected/b5911608-e23e-46e8-9637-488593110278-kube-api-access-bd47c\") pod \"telemetry-operator-controller-manager-bf98bb7b6-ghwv9\" (UID: \"b5911608-e23e-46e8-9637-488593110278\") " pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.922071 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.923165 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtr9d\" (UniqueName: \"kubernetes.io/projected/c147a183-5f67-45a0-a971-87e75df2a66e-kube-api-access-gtr9d\") pod \"watcher-operator-controller-manager-6cbc6dd547-khdlk\" (UID: \"c147a183-5f67-45a0-a971-87e75df2a66e\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.939487 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np"] Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.940668 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.964070 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.973743 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gfgj9" Oct 07 11:34:29 crc kubenswrapper[4700]: I1007 11:34:29.973976 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.022899 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.023332 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.032458 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.086814 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.087873 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.090352 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgpzz\" (UniqueName: \"kubernetes.io/projected/86cca1e4-373c-41df-b120-c3199ef30fe0-kube-api-access-kgpzz\") pod \"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.090406 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-n5qdz" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.090581 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86cca1e4-373c-41df-b120-c3199ef30fe0-cert\") pod \"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.091836 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.095247 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.135715 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.192117 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmx7s\" (UniqueName: \"kubernetes.io/projected/ad8f84bd-e06b-4015-8168-938a9e1ebeaa-kube-api-access-gmx7s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc\" (UID: \"ad8f84bd-e06b-4015-8168-938a9e1ebeaa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.192214 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.192260 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgpzz\" (UniqueName: \"kubernetes.io/projected/86cca1e4-373c-41df-b120-c3199ef30fe0-kube-api-access-kgpzz\") pod \"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.192363 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86cca1e4-373c-41df-b120-c3199ef30fe0-cert\") pod \"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: E1007 11:34:30.193572 4700 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 11:34:30 crc kubenswrapper[4700]: E1007 11:34:30.193659 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert podName:e4140e5b-60c0-42c1-9440-0070b773f8c6 nodeName:}" failed. No retries permitted until 2025-10-07 11:34:31.193636813 +0000 UTC m=+837.990035992 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" (UID: "e4140e5b-60c0-42c1-9440-0070b773f8c6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.198288 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86cca1e4-373c-41df-b120-c3199ef30fe0-cert\") pod \"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.228806 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgpzz\" (UniqueName: \"kubernetes.io/projected/86cca1e4-373c-41df-b120-c3199ef30fe0-kube-api-access-kgpzz\") pod 
\"openstack-operator-controller-manager-589f7cdddc-lk7np\" (UID: \"86cca1e4-373c-41df-b120-c3199ef30fe0\") " pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.231219 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.235678 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.298445 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmx7s\" (UniqueName: \"kubernetes.io/projected/ad8f84bd-e06b-4015-8168-938a9e1ebeaa-kube-api-access-gmx7s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc\" (UID: \"ad8f84bd-e06b-4015-8168-938a9e1ebeaa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.333352 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmx7s\" (UniqueName: \"kubernetes.io/projected/ad8f84bd-e06b-4015-8168-938a9e1ebeaa-kube-api-access-gmx7s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc\" (UID: \"ad8f84bd-e06b-4015-8168-938a9e1ebeaa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" Oct 07 11:34:30 crc kubenswrapper[4700]: W1007 11:34:30.354278 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b424cb2_c37e_4db9_86f4_75132c345127.slice/crio-a22bd520e97ce84773c6678c39e5363b2701cea491cc1d6206ed9348e7e5affa WatchSource:0}: Error finding container a22bd520e97ce84773c6678c39e5363b2701cea491cc1d6206ed9348e7e5affa: Status 404 returned error can't find the container with id 
a22bd520e97ce84773c6678c39e5363b2701cea491cc1d6206ed9348e7e5affa Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.397413 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.525275 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.564860 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.805903 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp"] Oct 07 11:34:30 crc kubenswrapper[4700]: W1007 11:34:30.811268 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef02b09_6290_414e_b0b3_f9d52138d53d.slice/crio-c431e1241fe612dde6eade77a6bdd6fbef43437288ca257ea7a97a0fbe6e584a WatchSource:0}: Error finding container c431e1241fe612dde6eade77a6bdd6fbef43437288ca257ea7a97a0fbe6e584a: Status 404 returned error can't find the container with id c431e1241fe612dde6eade77a6bdd6fbef43437288ca257ea7a97a0fbe6e584a Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.839476 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp"] Oct 07 11:34:30 crc kubenswrapper[4700]: I1007 11:34:30.989113 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.003638 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.012583 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.216617 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.227735 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4140e5b-60c0-42c1-9440-0070b773f8c6-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s\" (UID: \"e4140e5b-60c0-42c1-9440-0070b773f8c6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.295137 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" event={"ID":"b69da5cc-fa66-4adb-b136-1efe25092b40","Type":"ContainerStarted","Data":"31689d1eb7f2c3c39ef7519c29dd340af86dc75e61ed2dcfbe2b65420caca23f"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.296805 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" event={"ID":"6f56ee78-9e72-4fbb-abff-985e142a17cb","Type":"ContainerStarted","Data":"89c1f56e89412ef1b9ab7df1ee2ba641d3f371622c7d5df733db27b3fdcd128c"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.300516 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" event={"ID":"6992fc43-2f9e-414d-8c14-f08185ed395a","Type":"ContainerStarted","Data":"f4697e201a9ddd6371d26f91f441a74d990a6ff06c9018d27bad098564928b58"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.306496 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" event={"ID":"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3","Type":"ContainerStarted","Data":"1b47a8977cbb7470bf005e1bcaef109584afcd69e0e88f88351f191b8715949d"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.315753 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.316481 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" event={"ID":"6b424cb2-c37e-4db9-86f4-75132c345127","Type":"ContainerStarted","Data":"a22bd520e97ce84773c6678c39e5363b2701cea491cc1d6206ed9348e7e5affa"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.331439 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" event={"ID":"e260e7ed-c267-40b2-861a-9a77325e027a","Type":"ContainerStarted","Data":"c3c79f1d8cde30fc00d802fde231e844fe20aaf1e0a7d038c492a4f980101e92"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.356240 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" event={"ID":"42762f72-5039-4394-a311-299b57c3485a","Type":"ContainerStarted","Data":"bd62467b843b5a45f174512005f31dce5d2227d50a69ff9bd9377fd3e9b49c29"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.367508 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" event={"ID":"0ef02b09-6290-414e-b0b3-f9d52138d53d","Type":"ContainerStarted","Data":"c431e1241fe612dde6eade77a6bdd6fbef43437288ca257ea7a97a0fbe6e584a"} Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.394696 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.494470 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.513822 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.541503 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.552041 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.559038 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.569407 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.576400 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d"] Oct 07 11:34:31 crc kubenswrapper[4700]: W1007 11:34:31.582737 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50469d1_cf53_4396_9d9c_f03db7eb43f3.slice/crio-51a29ff15c5a9c3474cd60ba943d4b5a7bede2be87c17ce516a64d347906e617 WatchSource:0}: Error finding container 51a29ff15c5a9c3474cd60ba943d4b5a7bede2be87c17ce516a64d347906e617: Status 404 returned error can't find the container with id 51a29ff15c5a9c3474cd60ba943d4b5a7bede2be87c17ce516a64d347906e617 Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.582838 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.586697 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk"] Oct 07 11:34:31 crc kubenswrapper[4700]: W1007 11:34:31.668699 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df5f995_5a5e_4c40_a498_9dd5ffd4381c.slice/crio-3dd4dde6a65c04ac3c8cc02b5997b7a986af117a31a552f7d039e5f76120b694 WatchSource:0}: Error finding container 3dd4dde6a65c04ac3c8cc02b5997b7a986af117a31a552f7d039e5f76120b694: Status 404 returned error can't find the container with id 3dd4dde6a65c04ac3c8cc02b5997b7a986af117a31a552f7d039e5f76120b694 Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.704896 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk"] Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.761085 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9"] Oct 07 11:34:31 crc kubenswrapper[4700]: E1007 11:34:31.826701 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:38.102.83.89:5001/openstack-k8s-operators/telemetry-operator:dca0dd56bbbd37bc25583aa07e260de4ba61d7f2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bd47c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-bf98bb7b6-ghwv9_openstack-operators(b5911608-e23e-46e8-9637-488593110278): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.859117 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8"] Oct 07 11:34:31 crc kubenswrapper[4700]: E1007 11:34:31.865194 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtr9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-khdlk_openstack-operators(c147a183-5f67-45a0-a971-87e75df2a66e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 11:34:31 crc kubenswrapper[4700]: E1007 11:34:31.868973 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gmx7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc_openstack-operators(ad8f84bd-e06b-4015-8168-938a9e1ebeaa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 
11:34:31 crc kubenswrapper[4700]: E1007 11:34:31.869148 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5f7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-nq2r8_openstack-operators(0f2a771c-5adf-4c17-94ac-d7b988e3ea86): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 11:34:31 crc kubenswrapper[4700]: E1007 11:34:31.870369 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" podUID="ad8f84bd-e06b-4015-8168-938a9e1ebeaa" Oct 07 11:34:31 crc kubenswrapper[4700]: I1007 11:34:31.880049 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc"] Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.088820 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s"] Oct 07 11:34:32 crc kubenswrapper[4700]: W1007 11:34:32.135177 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4140e5b_60c0_42c1_9440_0070b773f8c6.slice/crio-6f3f2632f3ed765486d402d722c165686ee101827f6d568d7066c18b135f0587 WatchSource:0}: Error finding container 6f3f2632f3ed765486d402d722c165686ee101827f6d568d7066c18b135f0587: Status 404 returned error can't find the container with id 6f3f2632f3ed765486d402d722c165686ee101827f6d568d7066c18b135f0587 Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.328961 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" podUID="c147a183-5f67-45a0-a971-87e75df2a66e" Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.348692 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" podUID="0f2a771c-5adf-4c17-94ac-d7b988e3ea86" Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.348906 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" podUID="b5911608-e23e-46e8-9637-488593110278" Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.387275 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" event={"ID":"0f2a771c-5adf-4c17-94ac-d7b988e3ea86","Type":"ContainerStarted","Data":"db19e4235fbe1785e599ab0076dd1b7e9edf2a1bdee80c20b7716ab8ffb7b498"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.387348 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" 
event={"ID":"0f2a771c-5adf-4c17-94ac-d7b988e3ea86","Type":"ContainerStarted","Data":"d7da669d86acc45094726586cd42fcf135eb6029a40baabfcbb7ba7de69ad8e5"} Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.389818 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" podUID="0f2a771c-5adf-4c17-94ac-d7b988e3ea86" Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.390235 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" event={"ID":"7a64a375-b3e0-47ac-b715-cea59989a781","Type":"ContainerStarted","Data":"4c271169b066a9de9150d89ce6fb8769876020702e986ddfb7f3682f3a846f6f"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.398223 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" event={"ID":"b5911608-e23e-46e8-9637-488593110278","Type":"ContainerStarted","Data":"f81f6c45ba84eb75d20792e74b4d44f6873db1ca21760b77fe29dc75fa3a694b"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.398284 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" event={"ID":"b5911608-e23e-46e8-9637-488593110278","Type":"ContainerStarted","Data":"4cbdee30691296af8a138dd554d5b9b4b4e17f36ee1c26e89f5cd66f231f6ab5"} Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.400414 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.89:5001/openstack-k8s-operators/telemetry-operator:dca0dd56bbbd37bc25583aa07e260de4ba61d7f2\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" podUID="b5911608-e23e-46e8-9637-488593110278" Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.403686 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" event={"ID":"1f959f97-0c30-4e7c-a006-9517950bc1c1","Type":"ContainerStarted","Data":"3dc0aac40974b0e37d80d37d9ec47da52a0766d00abc967fe8ca060b16846f1c"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.405212 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" event={"ID":"1986b44c-0b98-4c70-a2a3-78f86e586d87","Type":"ContainerStarted","Data":"f00fce9353c5fc0c2afa2f7e9a0db88241a9c8d5de475283554f600cf3cd09a0"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.407689 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" event={"ID":"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8","Type":"ContainerStarted","Data":"8148063b545b48bfd7260fe348924ccb84391591ba864e9874c0ef2f3d849ff5"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.445551 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" event={"ID":"0df5f995-5a5e-4c40-a498-9dd5ffd4381c","Type":"ContainerStarted","Data":"3dd4dde6a65c04ac3c8cc02b5997b7a986af117a31a552f7d039e5f76120b694"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.455148 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" event={"ID":"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8","Type":"ContainerStarted","Data":"a543a06926cc0b725d8d2f2bd62c4b72e91a7ca15885e5f53e9b66f1e9369c6b"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.463903 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" event={"ID":"e4140e5b-60c0-42c1-9440-0070b773f8c6","Type":"ContainerStarted","Data":"6f3f2632f3ed765486d402d722c165686ee101827f6d568d7066c18b135f0587"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.479344 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" event={"ID":"c147a183-5f67-45a0-a971-87e75df2a66e","Type":"ContainerStarted","Data":"d0da4abba2e4dc56646a8b8a90fe5f6ecb7f0246d466d6bddeffc8a527141e11"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.479401 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" event={"ID":"c147a183-5f67-45a0-a971-87e75df2a66e","Type":"ContainerStarted","Data":"5d8896206f54cb667166b13e68c195d50876e5890a34dd86ec693148fcedb161"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.484861 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" event={"ID":"4e24a2f2-716e-4273-86a7-7ad450736748","Type":"ContainerStarted","Data":"8c70b1c2de8323c992eff499507d12652aef8890b9ada80c9b86a2271eac3e19"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.489067 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" event={"ID":"81628b8b-eb67-4514-bbb4-44341c3962ce","Type":"ContainerStarted","Data":"50e4219d591cc66fc2e769751eb6c656a5a9367a91e61cd8d84567ce626bfd42"} Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.492844 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" podUID="c147a183-5f67-45a0-a971-87e75df2a66e" Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.496921 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" event={"ID":"ad8f84bd-e06b-4015-8168-938a9e1ebeaa","Type":"ContainerStarted","Data":"ed101329973a52a0ece121d16f41eb204a1276013f863c5c0c5b5546431990c5"} Oct 07 11:34:32 crc kubenswrapper[4700]: E1007 11:34:32.498808 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" podUID="ad8f84bd-e06b-4015-8168-938a9e1ebeaa" Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.503295 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" event={"ID":"86cca1e4-373c-41df-b120-c3199ef30fe0","Type":"ContainerStarted","Data":"17727e3cda6b8f1f8dd402104443dce81d3764ef2227057a4e0518ad8ef44bbf"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.503391 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" event={"ID":"86cca1e4-373c-41df-b120-c3199ef30fe0","Type":"ContainerStarted","Data":"d8c396d645d9f9e72ca9148c4251ca23ce01a12b4d967edadac6c7fd9819f3c2"} Oct 07 11:34:32 crc kubenswrapper[4700]: I1007 11:34:32.505745 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" event={"ID":"d50469d1-cf53-4396-9d9c-f03db7eb43f3","Type":"ContainerStarted","Data":"51a29ff15c5a9c3474cd60ba943d4b5a7bede2be87c17ce516a64d347906e617"} 
Oct 07 11:34:33 crc kubenswrapper[4700]: I1007 11:34:33.521604 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" event={"ID":"86cca1e4-373c-41df-b120-c3199ef30fe0","Type":"ContainerStarted","Data":"846a4111347bebca499701312a523f4ad31dfd56453ffee69604d7aad820e4f0"} Oct 07 11:34:33 crc kubenswrapper[4700]: I1007 11:34:33.522114 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:33 crc kubenswrapper[4700]: E1007 11:34:33.523699 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" podUID="ad8f84bd-e06b-4015-8168-938a9e1ebeaa" Oct 07 11:34:33 crc kubenswrapper[4700]: E1007 11:34:33.524757 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" podUID="0f2a771c-5adf-4c17-94ac-d7b988e3ea86" Oct 07 11:34:33 crc kubenswrapper[4700]: E1007 11:34:33.527446 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.89:5001/openstack-k8s-operators/telemetry-operator:dca0dd56bbbd37bc25583aa07e260de4ba61d7f2\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" podUID="b5911608-e23e-46e8-9637-488593110278" Oct 07 11:34:33 crc 
kubenswrapper[4700]: E1007 11:34:33.527767 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" podUID="c147a183-5f67-45a0-a971-87e75df2a66e" Oct 07 11:34:33 crc kubenswrapper[4700]: I1007 11:34:33.633536 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" podStartSLOduration=4.633513206 podStartE2EDuration="4.633513206s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:34:33.5927876 +0000 UTC m=+840.389186589" watchObservedRunningTime="2025-10-07 11:34:33.633513206 +0000 UTC m=+840.429912185" Oct 07 11:34:40 crc kubenswrapper[4700]: I1007 11:34:40.533942 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-589f7cdddc-lk7np" Oct 07 11:34:44 crc kubenswrapper[4700]: E1007 11:34:44.320917 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862" Oct 07 11:34:44 crc kubenswrapper[4700]: E1007 11:34:44.321562 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kjl2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-8d984cc4d-jrw6r_openstack-operators(1986b44c-0b98-4c70-a2a3-78f86e586d87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:44 crc kubenswrapper[4700]: E1007 11:34:44.806640 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe" Oct 07 11:34:44 crc kubenswrapper[4700]: E1007 11:34:44.806868 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjj97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-5szjk_openstack-operators(bc62ffd3-f1d8-46c2-8777-c6ad960d68a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:45 crc kubenswrapper[4700]: E1007 11:34:45.334721 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757" Oct 07 11:34:45 crc kubenswrapper[4700]: E1007 11:34:45.335518 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tznpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
manila-operator-controller-manager-65d89cfd9f-n4w8d_openstack-operators(1f959f97-0c30-4e7c-a006-9517950bc1c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:45 crc kubenswrapper[4700]: E1007 11:34:45.862013 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb" Oct 07 11:34:45 crc kubenswrapper[4700]: E1007 11:34:45.862223 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tvxqf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-9zh29_openstack-operators(ae80f142-9ac8-4614-a0c5-b74dfe98b0c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:46 crc kubenswrapper[4700]: E1007 11:34:46.399362 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4" Oct 07 11:34:46 crc kubenswrapper[4700]: E1007 11:34:46.399573 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9z6n8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-operator-controller-manager-76d5b87f47-69f5t_openstack-operators(6f56ee78-9e72-4fbb-abff-985e142a17cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:46 crc kubenswrapper[4700]: E1007 11:34:46.901980 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09" Oct 07 11:34:46 crc kubenswrapper[4700]: E1007 11:34:46.902747 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrtfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6d8b6f9b9-jv8c7_openstack-operators(d50469d1-cf53-4396-9d9c-f03db7eb43f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:47 crc kubenswrapper[4700]: E1007 11:34:47.377189 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842" Oct 07 11:34:47 crc kubenswrapper[4700]: E1007 11:34:47.377447 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpc5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-7c7fc454ff-xxzx6_openstack-operators(7a64a375-b3e0-47ac-b715-cea59989a781): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:47 crc kubenswrapper[4700]: E1007 11:34:47.881633 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799" Oct 07 11:34:47 crc kubenswrapper[4700]: E1007 11:34:47.882281 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_
URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_U
RL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MUL
TIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE
_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMA
GE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OP
ENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RE
LATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9j2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s_openstack-operators(e4140e5b-60c0-42c1-9440-0070b773f8c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:48 crc kubenswrapper[4700]: E1007 11:34:48.426762 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f" Oct 07 11:34:48 crc kubenswrapper[4700]: E1007 11:34:48.427077 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxwn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-5nnwj_openstack-operators(0df5f995-5a5e-4c40-a498-9dd5ffd4381c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:48 crc kubenswrapper[4700]: E1007 11:34:48.836308 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1" Oct 07 11:34:48 crc kubenswrapper[4700]: E1007 11:34:48.836516 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dsmgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-54689d9f88-xvtgc_openstack-operators(81628b8b-eb67-4514-bbb4-44341c3962ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:49 crc kubenswrapper[4700]: E1007 11:34:49.460380 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182" Oct 07 11:34:49 crc kubenswrapper[4700]: E1007 11:34:49.461242 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6txtb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7468f855d8-x8twn_openstack-operators(4e24a2f2-716e-4273-86a7-7ad450736748): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.572480 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" podUID="bc62ffd3-f1d8-46c2-8777-c6ad960d68a8" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.689969 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" podUID="1986b44c-0b98-4c70-a2a3-78f86e586d87" Oct 07 11:34:51 crc kubenswrapper[4700]: I1007 11:34:51.726205 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" event={"ID":"1986b44c-0b98-4c70-a2a3-78f86e586d87","Type":"ContainerStarted","Data":"908d90b691df40c9c5d86d5111ed5fd0dbf9406874bf90a21c39e8fdadf93906"} Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.729465 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" podUID="1986b44c-0b98-4c70-a2a3-78f86e586d87" Oct 07 11:34:51 crc kubenswrapper[4700]: I1007 11:34:51.740763 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" event={"ID":"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8","Type":"ContainerStarted","Data":"9096578ec8b3def90e74dacb96a27b77b087ca9df4101865fb3297fa4c72c928"} Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.768419 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" podUID="bc62ffd3-f1d8-46c2-8777-c6ad960d68a8" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.861649 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" podUID="d50469d1-cf53-4396-9d9c-f03db7eb43f3" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.864722 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" podUID="0df5f995-5a5e-4c40-a498-9dd5ffd4381c" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.965349 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" podUID="81628b8b-eb67-4514-bbb4-44341c3962ce" Oct 07 11:34:51 crc kubenswrapper[4700]: E1007 11:34:51.970623 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" podUID="ae80f142-9ac8-4614-a0c5-b74dfe98b0c8" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.030362 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" podUID="e4140e5b-60c0-42c1-9440-0070b773f8c6" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.057300 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" podUID="7a64a375-b3e0-47ac-b715-cea59989a781" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.065055 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" podUID="1f959f97-0c30-4e7c-a006-9517950bc1c1" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.066203 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" podUID="6f56ee78-9e72-4fbb-abff-985e142a17cb" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.147378 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" podUID="4e24a2f2-716e-4273-86a7-7ad450736748" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.760472 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" event={"ID":"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3","Type":"ContainerStarted","Data":"b123f078156e473ae5a25908345a078f8e3f1190614e68a07e8195ca558c0603"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.761679 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" event={"ID":"81628b8b-eb67-4514-bbb4-44341c3962ce","Type":"ContainerStarted","Data":"1ced2fa206dc20863fe1e094cdde6aecf7565c2575e4cc0be9ebe9aa7ecb31e8"} Oct 07 11:34:52 crc 
kubenswrapper[4700]: E1007 11:34:52.763639 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" podUID="81628b8b-eb67-4514-bbb4-44341c3962ce" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.764619 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" event={"ID":"b69da5cc-fa66-4adb-b136-1efe25092b40","Type":"ContainerStarted","Data":"0b6e011e36605624733f60ec95d18c51deb739bfd4d62228b49acec97c213bbf"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.774961 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" event={"ID":"ad8f84bd-e06b-4015-8168-938a9e1ebeaa","Type":"ContainerStarted","Data":"0ac9c49bec4bc80b4faa101c7c6a533b6871b00eae5a341bd37a694a825b0e91"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.786413 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" event={"ID":"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8","Type":"ContainerStarted","Data":"6c6ab142d52a0741f03bac980bce42a1fca2f8dd2e1932ee74db0d0a033c497c"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.790907 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" podUID="ae80f142-9ac8-4614-a0c5-b74dfe98b0c8" Oct 07 11:34:52 crc 
kubenswrapper[4700]: I1007 11:34:52.808426 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" event={"ID":"6b424cb2-c37e-4db9-86f4-75132c345127","Type":"ContainerStarted","Data":"b8b4c2715bb49fcfc5b7da5291f0d2a3c118084e6bdccd48acac66b6d351463d"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.808488 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" event={"ID":"6b424cb2-c37e-4db9-86f4-75132c345127","Type":"ContainerStarted","Data":"1f518f45ce0c5a5f7b2afa7f876ba59d8492933c12b847bbb3cd6ee60d1d27c7"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.809390 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.815430 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" event={"ID":"e4140e5b-60c0-42c1-9440-0070b773f8c6","Type":"ContainerStarted","Data":"10a0bcb5bdcc576c497aedf9c24b17f99835351611a1a9bda37cd6af29c74795"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.839863 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" podUID="e4140e5b-60c0-42c1-9440-0070b773f8c6" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.846107 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" 
event={"ID":"c147a183-5f67-45a0-a971-87e75df2a66e","Type":"ContainerStarted","Data":"8cb592880ac4f9ed82bd63fe30e3413c7d5ba378045c14d342fabde93bf48626"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.846990 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.848328 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" event={"ID":"7a64a375-b3e0-47ac-b715-cea59989a781","Type":"ContainerStarted","Data":"7251358b92b1737cac232cf8bee98b3498ccd805eafb2c54d4d5edfdf37e9c46"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.855858 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" podUID="7a64a375-b3e0-47ac-b715-cea59989a781" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.866984 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" event={"ID":"42762f72-5039-4394-a311-299b57c3485a","Type":"ContainerStarted","Data":"8576315a3a9d024df7589b0fc50e067b813707d9729133fc33bfc2144998032b"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.874744 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" event={"ID":"e260e7ed-c267-40b2-861a-9a77325e027a","Type":"ContainerStarted","Data":"47c4e0a5f2d83597d047336ed6a3918b75a927026d88e58a727f431a127f058b"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.877921 4700 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc" podStartSLOduration=4.286604052 podStartE2EDuration="23.877903435s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.868850633 +0000 UTC m=+838.665249622" lastFinishedPulling="2025-10-07 11:34:51.460150016 +0000 UTC m=+858.256549005" observedRunningTime="2025-10-07 11:34:52.874961828 +0000 UTC m=+859.671360817" watchObservedRunningTime="2025-10-07 11:34:52.877903435 +0000 UTC m=+859.674302424" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.885977 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" event={"ID":"1f959f97-0c30-4e7c-a006-9517950bc1c1","Type":"ContainerStarted","Data":"066210c498c4e54772734d189c379823ae4bf87c5f195a09b1462180c6854a75"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.889886 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" podUID="1f959f97-0c30-4e7c-a006-9517950bc1c1" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.905637 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" event={"ID":"d50469d1-cf53-4396-9d9c-f03db7eb43f3","Type":"ContainerStarted","Data":"039579f06d732e80b9ef75389754cb767381509bbaf804a6d29d66d708e981ca"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.910240 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" podUID="d50469d1-cf53-4396-9d9c-f03db7eb43f3" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.919501 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" event={"ID":"0ef02b09-6290-414e-b0b3-f9d52138d53d","Type":"ContainerStarted","Data":"12f336856fda243ade17a4ef211f883ef4c9ac6881bdf632ca681c0dd0bc5d01"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.924729 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" event={"ID":"b5911608-e23e-46e8-9637-488593110278","Type":"ContainerStarted","Data":"d04aa1dd6064492c926c83f2c19bb97fc3cd39a53125fce256724aa054998731"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.925522 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.935977 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" event={"ID":"6992fc43-2f9e-414d-8c14-f08185ed395a","Type":"ContainerStarted","Data":"b6675638565a29eb7f5291d61a2ff4921edbfb4aab51678aa0bc975e5c3ae9c9"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.942708 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" event={"ID":"6f56ee78-9e72-4fbb-abff-985e142a17cb","Type":"ContainerStarted","Data":"3e05be494681a9daf23c1e09cd9e8a62261746f85fc73d0e330f47994b5a338d"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.944397 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" podUID="6f56ee78-9e72-4fbb-abff-985e142a17cb" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.952816 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" event={"ID":"0f2a771c-5adf-4c17-94ac-d7b988e3ea86","Type":"ContainerStarted","Data":"7252dfec80599b916c20352f9836c87bbf7328c8bc3a1ad19fbb4984c131cf96"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.953652 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.962686 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" event={"ID":"4e24a2f2-716e-4273-86a7-7ad450736748","Type":"ContainerStarted","Data":"623d029de2d832e708d92bc3cece055cc77d5ea6050370681577b353688f6991"} Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.965669 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" podUID="4e24a2f2-716e-4273-86a7-7ad450736748" Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.974280 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" 
event={"ID":"0df5f995-5a5e-4c40-a498-9dd5ffd4381c","Type":"ContainerStarted","Data":"4f3761d78bc89bf20547c3450538f539611c3a968b52eb2f44385b4b2fe61c6d"} Oct 07 11:34:52 crc kubenswrapper[4700]: I1007 11:34:52.975172 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" podStartSLOduration=4.293540144 podStartE2EDuration="23.974356278s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.865004022 +0000 UTC m=+838.661403011" lastFinishedPulling="2025-10-07 11:34:51.545820156 +0000 UTC m=+858.342219145" observedRunningTime="2025-10-07 11:34:52.965499796 +0000 UTC m=+859.761898785" watchObservedRunningTime="2025-10-07 11:34:52.974356278 +0000 UTC m=+859.770755267" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.976883 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" podUID="1986b44c-0b98-4c70-a2a3-78f86e586d87" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.976957 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" podUID="bc62ffd3-f1d8-46c2-8777-c6ad960d68a8" Oct 07 11:34:52 crc kubenswrapper[4700]: E1007 11:34:52.977025 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" podUID="0df5f995-5a5e-4c40-a498-9dd5ffd4381c" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.153860 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" podStartSLOduration=4.320362481 podStartE2EDuration="25.153840932s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:30.376549473 +0000 UTC m=+837.172948462" lastFinishedPulling="2025-10-07 11:34:51.210027924 +0000 UTC m=+858.006426913" observedRunningTime="2025-10-07 11:34:53.082869316 +0000 UTC m=+859.879268295" watchObservedRunningTime="2025-10-07 11:34:53.153840932 +0000 UTC m=+859.950239921" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.248840 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" podStartSLOduration=4.656458905 podStartE2EDuration="24.248821086s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.869076739 +0000 UTC m=+838.665475728" lastFinishedPulling="2025-10-07 11:34:51.46143892 +0000 UTC m=+858.257837909" observedRunningTime="2025-10-07 11:34:53.244564635 +0000 UTC m=+860.040963624" watchObservedRunningTime="2025-10-07 11:34:53.248821086 +0000 UTC m=+860.045220075" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.314226 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" podStartSLOduration=4.596501946 podStartE2EDuration="24.314204966s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.826429402 +0000 UTC m=+838.622828391" 
lastFinishedPulling="2025-10-07 11:34:51.544132432 +0000 UTC m=+858.340531411" observedRunningTime="2025-10-07 11:34:53.312284486 +0000 UTC m=+860.108683505" watchObservedRunningTime="2025-10-07 11:34:53.314204966 +0000 UTC m=+860.110603955" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.985603 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" event={"ID":"6992fc43-2f9e-414d-8c14-f08185ed395a","Type":"ContainerStarted","Data":"09074ac7bc415d94de3b229e248992b15c89796270cadc525d91cfdcb0d39380"} Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.986664 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.990323 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" event={"ID":"42762f72-5039-4394-a311-299b57c3485a","Type":"ContainerStarted","Data":"57333798ba69090c315bb5ea1dff5b504adf3ccfe33bc5cd36727cd80cf6bf00"} Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.990546 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.996616 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" event={"ID":"0ef02b09-6290-414e-b0b3-f9d52138d53d","Type":"ContainerStarted","Data":"ce614af3cdadca06212eecb57f7f92f59157048ae2d3be6b04b4818ec3ab9886"} Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.996735 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.999279 4700 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" event={"ID":"b5298439-5a42-4bca-aa5b-c3fb26b2e5e3","Type":"ContainerStarted","Data":"804d20e3243bad7a2269c8eec0ff64da5607b28b1e8920e930b3e4f3b0ba2561"} Oct 07 11:34:53 crc kubenswrapper[4700]: I1007 11:34:53.999499 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.003263 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" event={"ID":"e260e7ed-c267-40b2-861a-9a77325e027a","Type":"ContainerStarted","Data":"07ecde031ffb8a7838b59624371df139e212de45d8a4a8226a74cb656faf9db9"} Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.003429 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.005077 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" podStartSLOduration=5.650394864 podStartE2EDuration="26.005066016s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:30.854844849 +0000 UTC m=+837.651243838" lastFinishedPulling="2025-10-07 11:34:51.209516001 +0000 UTC m=+858.005914990" observedRunningTime="2025-10-07 11:34:54.00294134 +0000 UTC m=+860.799340339" watchObservedRunningTime="2025-10-07 11:34:54.005066016 +0000 UTC m=+860.801465005" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.006841 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" 
event={"ID":"b69da5cc-fa66-4adb-b136-1efe25092b40","Type":"ContainerStarted","Data":"deab14d39f8ad34494dee4b5f7ed005ba011fb2531883d34b39f35c9bad5ff6b"} Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.010544 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" podUID="7a64a375-b3e0-47ac-b715-cea59989a781" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.010993 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" podUID="81628b8b-eb67-4514-bbb4-44341c3962ce" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011022 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" podUID="6f56ee78-9e72-4fbb-abff-985e142a17cb" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011064 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" 
podUID="0df5f995-5a5e-4c40-a498-9dd5ffd4381c" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011107 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" podUID="ae80f142-9ac8-4614-a0c5-b74dfe98b0c8" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011154 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" podUID="d50469d1-cf53-4396-9d9c-f03db7eb43f3" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011247 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" podUID="1f959f97-0c30-4e7c-a006-9517950bc1c1" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011820 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" podUID="4e24a2f2-716e-4273-86a7-7ad450736748" Oct 07 11:34:54 crc kubenswrapper[4700]: E1007 11:34:54.011869 4700 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" podUID="e4140e5b-60c0-42c1-9440-0070b773f8c6" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.026237 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" podStartSLOduration=5.846928198 podStartE2EDuration="26.026209168s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.030316363 +0000 UTC m=+837.826715352" lastFinishedPulling="2025-10-07 11:34:51.209597333 +0000 UTC m=+858.005996322" observedRunningTime="2025-10-07 11:34:54.02053873 +0000 UTC m=+860.816937709" watchObservedRunningTime="2025-10-07 11:34:54.026209168 +0000 UTC m=+860.822608167" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.057460 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" podStartSLOduration=5.208000996 podStartE2EDuration="26.057438645s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:30.36041087 +0000 UTC m=+837.156809859" lastFinishedPulling="2025-10-07 11:34:51.209848519 +0000 UTC m=+858.006247508" observedRunningTime="2025-10-07 11:34:54.040075931 +0000 UTC m=+860.836474950" watchObservedRunningTime="2025-10-07 11:34:54.057438645 +0000 UTC m=+860.853837634" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.076480 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" podStartSLOduration=6.11219629 
podStartE2EDuration="26.076442612s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:30.818955309 +0000 UTC m=+837.615354298" lastFinishedPulling="2025-10-07 11:34:50.783201631 +0000 UTC m=+857.579600620" observedRunningTime="2025-10-07 11:34:54.054616692 +0000 UTC m=+860.851015681" watchObservedRunningTime="2025-10-07 11:34:54.076442612 +0000 UTC m=+860.872841601" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.097810 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" podStartSLOduration=5.321420155 podStartE2EDuration="26.097356449s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:30.433759221 +0000 UTC m=+837.230158210" lastFinishedPulling="2025-10-07 11:34:51.209695495 +0000 UTC m=+858.006094504" observedRunningTime="2025-10-07 11:34:54.078329572 +0000 UTC m=+860.874728581" watchObservedRunningTime="2025-10-07 11:34:54.097356449 +0000 UTC m=+860.893755448" Oct 07 11:34:54 crc kubenswrapper[4700]: I1007 11:34:54.257577 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" podStartSLOduration=6.073116604 podStartE2EDuration="26.257550329s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.025085776 +0000 UTC m=+837.821484755" lastFinishedPulling="2025-10-07 11:34:51.209519491 +0000 UTC m=+858.005918480" observedRunningTime="2025-10-07 11:34:54.25223108 +0000 UTC m=+861.048630079" watchObservedRunningTime="2025-10-07 11:34:54.257550329 +0000 UTC m=+861.053949318" Oct 07 11:34:55 crc kubenswrapper[4700]: I1007 11:34:55.015964 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.220090 
4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.223502 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.248281 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.291509 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.291608 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvmp\" (UniqueName: \"kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.291644 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.393250 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities\") pod \"redhat-operators-whcl4\" (UID: 
\"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.393399 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcvmp\" (UniqueName: \"kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.393441 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.394094 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.394092 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.418976 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvmp\" (UniqueName: \"kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp\") pod \"redhat-operators-whcl4\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " 
pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.564163 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:34:58 crc kubenswrapper[4700]: I1007 11:34:58.994391 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.046023 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-vvp7c" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.054356 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerStarted","Data":"7652d954523c80cda8f43c0a14ce7e3de4c8e69e0dc1a96e041b8afe960d038b"} Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.061507 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-mg8jv" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.117930 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9xkbh" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.479694 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8dhrp" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.484289 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s2qwp" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.545762 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-649675d675-dfs4r" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.547074 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-dnn6x" Oct 07 11:34:59 crc kubenswrapper[4700]: I1007 11:34:59.944227 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-khdlk" Oct 07 11:35:00 crc kubenswrapper[4700]: I1007 11:35:00.036866 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-nq2r8" Oct 07 11:35:00 crc kubenswrapper[4700]: I1007 11:35:00.071472 4700 generic.go:334] "Generic (PLEG): container finished" podID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerID="1b73a8fb82c2647d0b1364a7435e4ed5f3e88238fa19b51e7cf889c5f7d08c93" exitCode=0 Oct 07 11:35:00 crc kubenswrapper[4700]: I1007 11:35:00.071549 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerDied","Data":"1b73a8fb82c2647d0b1364a7435e4ed5f3e88238fa19b51e7cf889c5f7d08c93"} Oct 07 11:35:00 crc kubenswrapper[4700]: I1007 11:35:00.109223 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-bf98bb7b6-ghwv9" Oct 07 11:35:02 crc kubenswrapper[4700]: I1007 11:35:02.110530 4700 generic.go:334] "Generic (PLEG): container finished" podID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerID="d15dfb6d9175a213baf6ec33faca4d94d7af68be164664c58cd3b43958d6bb42" exitCode=0 Oct 07 11:35:02 crc kubenswrapper[4700]: I1007 11:35:02.110590 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" 
event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerDied","Data":"d15dfb6d9175a213baf6ec33faca4d94d7af68be164664c58cd3b43958d6bb42"} Oct 07 11:35:03 crc kubenswrapper[4700]: I1007 11:35:03.125142 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerStarted","Data":"7614682fc253b67fedf8dfc7a0d5a3da5eb14966d51654e0f895b33539ca51d8"} Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.647904 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-whcl4" podStartSLOduration=5.046869205 podStartE2EDuration="7.647877803s" podCreationTimestamp="2025-10-07 11:34:58 +0000 UTC" firstStartedPulling="2025-10-07 11:35:00.079676091 +0000 UTC m=+866.876075110" lastFinishedPulling="2025-10-07 11:35:02.680684679 +0000 UTC m=+869.477083708" observedRunningTime="2025-10-07 11:35:03.158845814 +0000 UTC m=+869.955244863" watchObservedRunningTime="2025-10-07 11:35:05.647877803 +0000 UTC m=+872.444276792" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.653364 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.655966 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.662531 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.722436 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.722491 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmn76\" (UniqueName: \"kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.722535 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.823940 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.824350 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mmn76\" (UniqueName: \"kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.824390 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.824744 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.824816 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.853565 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmn76\" (UniqueName: \"kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76\") pod \"redhat-marketplace-22wzl\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:05 crc kubenswrapper[4700]: I1007 11:35:05.987317 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.181063 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" event={"ID":"4e24a2f2-716e-4273-86a7-7ad450736748","Type":"ContainerStarted","Data":"8951f786960db9058cfbba5e6be34e1375ac423ca0ee014967f21aade3dcae83"} Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.182250 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.199532 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" event={"ID":"81628b8b-eb67-4514-bbb4-44341c3962ce","Type":"ContainerStarted","Data":"33c1e3fc3fa2bab2aa44ecf7f80ea9ca31a5403c7def93376d75fe8cb37e5250"} Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.200471 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.201914 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" event={"ID":"bc62ffd3-f1d8-46c2-8777-c6ad960d68a8","Type":"ContainerStarted","Data":"be6af551231380f47a8e1148e2295d88fdf4ec5931f076036cdb62b6e79a183d"} Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.202406 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.203655 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" 
event={"ID":"e4140e5b-60c0-42c1-9440-0070b773f8c6","Type":"ContainerStarted","Data":"b55734328304a48b7060d52dc17bdbf49ccf020333cd9381c9316f813089fdb5"} Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.204107 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.241812 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" podStartSLOduration=3.337902607 podStartE2EDuration="37.241787516s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.793787567 +0000 UTC m=+838.590186556" lastFinishedPulling="2025-10-07 11:35:05.697672456 +0000 UTC m=+872.494071465" observedRunningTime="2025-10-07 11:35:06.235598755 +0000 UTC m=+873.031997754" watchObservedRunningTime="2025-10-07 11:35:06.241787516 +0000 UTC m=+873.038186505" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.242752 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" podStartSLOduration=3.390449683 podStartE2EDuration="37.242747182s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.794057674 +0000 UTC m=+838.590456663" lastFinishedPulling="2025-10-07 11:35:05.646355133 +0000 UTC m=+872.442754162" observedRunningTime="2025-10-07 11:35:06.216597698 +0000 UTC m=+873.012996697" watchObservedRunningTime="2025-10-07 11:35:06.242747182 +0000 UTC m=+873.039146171" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.272520 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" podStartSLOduration=3.589248334 podStartE2EDuration="37.27249872s" 
podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:32.151800863 +0000 UTC m=+838.948199852" lastFinishedPulling="2025-10-07 11:35:05.835051249 +0000 UTC m=+872.631450238" observedRunningTime="2025-10-07 11:35:06.263485484 +0000 UTC m=+873.059884473" watchObservedRunningTime="2025-10-07 11:35:06.27249872 +0000 UTC m=+873.068897709" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.294381 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" podStartSLOduration=3.198145763 podStartE2EDuration="37.294360751s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.803627395 +0000 UTC m=+838.600026384" lastFinishedPulling="2025-10-07 11:35:05.899842383 +0000 UTC m=+872.696241372" observedRunningTime="2025-10-07 11:35:06.288745665 +0000 UTC m=+873.085144654" watchObservedRunningTime="2025-10-07 11:35:06.294360751 +0000 UTC m=+873.090759740" Oct 07 11:35:06 crc kubenswrapper[4700]: I1007 11:35:06.314013 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:07 crc kubenswrapper[4700]: I1007 11:35:07.211883 4700 generic.go:334] "Generic (PLEG): container finished" podID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerID="be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42" exitCode=0 Oct 07 11:35:07 crc kubenswrapper[4700]: I1007 11:35:07.211994 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerDied","Data":"be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42"} Oct 07 11:35:07 crc kubenswrapper[4700]: I1007 11:35:07.212480 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" 
event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerStarted","Data":"5b2f35fd101ce4ce07293ea4dfa024642b61cf64191d20411f2db80d30da715f"} Oct 07 11:35:07 crc kubenswrapper[4700]: I1007 11:35:07.214965 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" event={"ID":"7a64a375-b3e0-47ac-b715-cea59989a781","Type":"ContainerStarted","Data":"e59fda3ba5dc33ca3c4d12cdb760980202f3e111d37b7c8b3cc58a1c5415b170"} Oct 07 11:35:07 crc kubenswrapper[4700]: I1007 11:35:07.255870 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" podStartSLOduration=3.4065090270000002 podStartE2EDuration="38.255839737s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.668774243 +0000 UTC m=+838.465173232" lastFinishedPulling="2025-10-07 11:35:06.518104953 +0000 UTC m=+873.314503942" observedRunningTime="2025-10-07 11:35:07.252622773 +0000 UTC m=+874.049021792" watchObservedRunningTime="2025-10-07 11:35:07.255839737 +0000 UTC m=+874.052238746" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.222745 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" event={"ID":"6f56ee78-9e72-4fbb-abff-985e142a17cb","Type":"ContainerStarted","Data":"c29b35d4e3a7249b5b84467903448d31afc993a73c4bb3017e613c7457e78f4e"} Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.222987 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.225345 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" 
event={"ID":"1f959f97-0c30-4e7c-a006-9517950bc1c1","Type":"ContainerStarted","Data":"aa4ed80709aeeb8f94112ba23a82583069157fa46f0c7e87ceb52f064c8e7dfa"} Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.225704 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.293667 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" podStartSLOduration=3.820976832 podStartE2EDuration="40.29363628s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.025825125 +0000 UTC m=+837.822224114" lastFinishedPulling="2025-10-07 11:35:07.498484583 +0000 UTC m=+874.294883562" observedRunningTime="2025-10-07 11:35:08.285787255 +0000 UTC m=+875.082186314" watchObservedRunningTime="2025-10-07 11:35:08.29363628 +0000 UTC m=+875.090035309" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.316151 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" podStartSLOduration=4.622951983 podStartE2EDuration="40.316124368s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.803154692 +0000 UTC m=+838.599553681" lastFinishedPulling="2025-10-07 11:35:07.496327077 +0000 UTC m=+874.292726066" observedRunningTime="2025-10-07 11:35:08.31158343 +0000 UTC m=+875.107982419" watchObservedRunningTime="2025-10-07 11:35:08.316124368 +0000 UTC m=+875.112523397" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.564300 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.564794 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:08 crc kubenswrapper[4700]: I1007 11:35:08.632896 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.239878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" event={"ID":"1986b44c-0b98-4c70-a2a3-78f86e586d87","Type":"ContainerStarted","Data":"3b441b61833a63fde71f649ef3e2d82b4fd1c5e1aaf122bc64ce1792347ab301"} Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.241556 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.243861 4700 generic.go:334] "Generic (PLEG): container finished" podID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerID="c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2" exitCode=0 Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.243950 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerDied","Data":"c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2"} Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.283003 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" podStartSLOduration=3.070352131 podStartE2EDuration="40.281291452s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.511816503 +0000 UTC m=+838.308215492" lastFinishedPulling="2025-10-07 11:35:08.722755794 +0000 UTC m=+875.519154813" observedRunningTime="2025-10-07 11:35:09.266958617 +0000 UTC m=+876.063357666" watchObservedRunningTime="2025-10-07 
11:35:09.281291452 +0000 UTC m=+876.077690501" Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.312033 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:09 crc kubenswrapper[4700]: I1007 11:35:09.703398 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:35:10 crc kubenswrapper[4700]: I1007 11:35:10.180897 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:35:11 crc kubenswrapper[4700]: I1007 11:35:11.277662 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-whcl4" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="registry-server" containerID="cri-o://7614682fc253b67fedf8dfc7a0d5a3da5eb14966d51654e0f895b33539ca51d8" gracePeriod=2 Oct 07 11:35:11 crc kubenswrapper[4700]: I1007 11:35:11.334928 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:16.435269 4700 generic.go:334] "Generic (PLEG): container finished" podID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerID="7614682fc253b67fedf8dfc7a0d5a3da5eb14966d51654e0f895b33539ca51d8" exitCode=0 Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:16.435355 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerDied","Data":"7614682fc253b67fedf8dfc7a0d5a3da5eb14966d51654e0f895b33539ca51d8"} Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:17.450517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" 
event={"ID":"0df5f995-5a5e-4c40-a498-9dd5ffd4381c","Type":"ContainerStarted","Data":"9d8d13060ebd4d356c0b8f0778e0fefe55f3040f1fafa1826d6dacca3ac6854b"} Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:17.451132 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.202081 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.222615 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" podStartSLOduration=12.622351623 podStartE2EDuration="50.222595436s" podCreationTimestamp="2025-10-07 11:34:28 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.803193693 +0000 UTC m=+838.599592682" lastFinishedPulling="2025-10-07 11:35:09.403437496 +0000 UTC m=+876.199836495" observedRunningTime="2025-10-07 11:35:17.497971143 +0000 UTC m=+884.294370172" watchObservedRunningTime="2025-10-07 11:35:18.222595436 +0000 UTC m=+885.018994425" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.258227 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content\") pod \"c8986340-c2d7-47f2-9b6d-73896e1362ec\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.258367 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcvmp\" (UniqueName: \"kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp\") pod \"c8986340-c2d7-47f2-9b6d-73896e1362ec\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.258441 
4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities\") pod \"c8986340-c2d7-47f2-9b6d-73896e1362ec\" (UID: \"c8986340-c2d7-47f2-9b6d-73896e1362ec\") " Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.259627 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities" (OuterVolumeSpecName: "utilities") pod "c8986340-c2d7-47f2-9b6d-73896e1362ec" (UID: "c8986340-c2d7-47f2-9b6d-73896e1362ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.268808 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp" (OuterVolumeSpecName: "kube-api-access-gcvmp") pod "c8986340-c2d7-47f2-9b6d-73896e1362ec" (UID: "c8986340-c2d7-47f2-9b6d-73896e1362ec"). InnerVolumeSpecName "kube-api-access-gcvmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.330207 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8986340-c2d7-47f2-9b6d-73896e1362ec" (UID: "c8986340-c2d7-47f2-9b6d-73896e1362ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.359789 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcvmp\" (UniqueName: \"kubernetes.io/projected/c8986340-c2d7-47f2-9b6d-73896e1362ec-kube-api-access-gcvmp\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.359813 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.359822 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8986340-c2d7-47f2-9b6d-73896e1362ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.461069 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whcl4" event={"ID":"c8986340-c2d7-47f2-9b6d-73896e1362ec","Type":"ContainerDied","Data":"7652d954523c80cda8f43c0a14ce7e3de4c8e69e0dc1a96e041b8afe960d038b"} Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.461127 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-whcl4" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.461172 4700 scope.go:117] "RemoveContainer" containerID="7614682fc253b67fedf8dfc7a0d5a3da5eb14966d51654e0f895b33539ca51d8" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.494299 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.506257 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-whcl4"] Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.766320 4700 scope.go:117] "RemoveContainer" containerID="d15dfb6d9175a213baf6ec33faca4d94d7af68be164664c58cd3b43958d6bb42" Oct 07 11:35:18 crc kubenswrapper[4700]: I1007 11:35:18.806884 4700 scope.go:117] "RemoveContainer" containerID="1b73a8fb82c2647d0b1364a7435e4ed5f3e88238fa19b51e7cf889c5f7d08c93" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.469482 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" event={"ID":"d50469d1-cf53-4396-9d9c-f03db7eb43f3","Type":"ContainerStarted","Data":"381287969e3e60d496645871e9140dd93dc9928593cb98cc157ab6d269adca66"} Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.469930 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.471988 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" event={"ID":"ae80f142-9ac8-4614-a0c5-b74dfe98b0c8","Type":"ContainerStarted","Data":"046a0c4bed216834aaf06756e18b8f1359c87ec098a94c311a56aa59cbeb829f"} Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.472327 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.479973 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerStarted","Data":"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651"} Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.530694 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-22wzl" podStartSLOduration=2.978552879 podStartE2EDuration="14.530673918s" podCreationTimestamp="2025-10-07 11:35:05 +0000 UTC" firstStartedPulling="2025-10-07 11:35:07.214085725 +0000 UTC m=+874.010484704" lastFinishedPulling="2025-10-07 11:35:18.766206754 +0000 UTC m=+885.562605743" observedRunningTime="2025-10-07 11:35:19.523602683 +0000 UTC m=+886.320001672" watchObservedRunningTime="2025-10-07 11:35:19.530673918 +0000 UTC m=+886.327072917" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.531584 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" podStartSLOduration=3.404730481 podStartE2EDuration="50.531578152s" podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.639513247 +0000 UTC m=+838.435912226" lastFinishedPulling="2025-10-07 11:35:18.766360908 +0000 UTC m=+885.562759897" observedRunningTime="2025-10-07 11:35:19.495035606 +0000 UTC m=+886.291434595" watchObservedRunningTime="2025-10-07 11:35:19.531578152 +0000 UTC m=+886.327977161" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.543103 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" podStartSLOduration=3.5690915050000003 podStartE2EDuration="50.543086803s" 
podCreationTimestamp="2025-10-07 11:34:29 +0000 UTC" firstStartedPulling="2025-10-07 11:34:31.793777237 +0000 UTC m=+838.590176226" lastFinishedPulling="2025-10-07 11:35:18.767772525 +0000 UTC m=+885.564171524" observedRunningTime="2025-10-07 11:35:19.538319668 +0000 UTC m=+886.334718657" watchObservedRunningTime="2025-10-07 11:35:19.543086803 +0000 UTC m=+886.339485792" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.546578 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-69f5t" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.604473 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-n4w8d" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.706374 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xxzx6" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.747582 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5szjk" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.751271 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-x8twn" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.853874 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-xvtgc" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.927501 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jrw6r" Oct 07 11:35:19 crc kubenswrapper[4700]: I1007 11:35:19.966350 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" path="/var/lib/kubelet/pods/c8986340-c2d7-47f2-9b6d-73896e1362ec/volumes" Oct 07 11:35:20 crc kubenswrapper[4700]: I1007 11:35:20.031491 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5nnwj" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.677938 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:23 crc kubenswrapper[4700]: E1007 11:35:23.678948 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="registry-server" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.678977 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="registry-server" Oct 07 11:35:23 crc kubenswrapper[4700]: E1007 11:35:23.679042 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="extract-utilities" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.679060 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="extract-utilities" Oct 07 11:35:23 crc kubenswrapper[4700]: E1007 11:35:23.679092 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="extract-content" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.679109 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="extract-content" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.679516 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8986340-c2d7-47f2-9b6d-73896e1362ec" containerName="registry-server" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.681724 4700 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.701512 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.748087 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jvf\" (UniqueName: \"kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.748163 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.748204 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.849740 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.849811 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.849874 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jvf\" (UniqueName: \"kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.850507 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.850540 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:23 crc kubenswrapper[4700]: I1007 11:35:23.875647 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jvf\" (UniqueName: \"kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf\") pod \"community-operators-zk2n7\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:24 crc kubenswrapper[4700]: I1007 11:35:24.019411 4700 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:24 crc kubenswrapper[4700]: I1007 11:35:24.515822 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:25 crc kubenswrapper[4700]: I1007 11:35:25.536709 4700 generic.go:334] "Generic (PLEG): container finished" podID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerID="b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343" exitCode=0 Oct 07 11:35:25 crc kubenswrapper[4700]: I1007 11:35:25.536804 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerDied","Data":"b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343"} Oct 07 11:35:25 crc kubenswrapper[4700]: I1007 11:35:25.537097 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerStarted","Data":"6e805956c3d5ccd363e49e952dd0f90e3b4d432e7d69ad6a275f52a92ae0bde6"} Oct 07 11:35:25 crc kubenswrapper[4700]: I1007 11:35:25.988231 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:25 crc kubenswrapper[4700]: I1007 11:35:25.988283 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:26 crc kubenswrapper[4700]: I1007 11:35:26.034016 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:26 crc kubenswrapper[4700]: I1007 11:35:26.603242 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:27 crc kubenswrapper[4700]: I1007 11:35:27.559421 4700 
generic.go:334] "Generic (PLEG): container finished" podID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerID="f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2" exitCode=0 Oct 07 11:35:27 crc kubenswrapper[4700]: I1007 11:35:27.559517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerDied","Data":"f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2"} Oct 07 11:35:28 crc kubenswrapper[4700]: I1007 11:35:28.451169 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:28 crc kubenswrapper[4700]: I1007 11:35:28.567492 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerStarted","Data":"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce"} Oct 07 11:35:28 crc kubenswrapper[4700]: I1007 11:35:28.567649 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-22wzl" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="registry-server" containerID="cri-o://cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651" gracePeriod=2 Oct 07 11:35:28 crc kubenswrapper[4700]: I1007 11:35:28.597291 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zk2n7" podStartSLOduration=3.132455031 podStartE2EDuration="5.597273048s" podCreationTimestamp="2025-10-07 11:35:23 +0000 UTC" firstStartedPulling="2025-10-07 11:35:25.539544075 +0000 UTC m=+892.335943094" lastFinishedPulling="2025-10-07 11:35:28.004362122 +0000 UTC m=+894.800761111" observedRunningTime="2025-10-07 11:35:28.592042911 +0000 UTC m=+895.388441900" watchObservedRunningTime="2025-10-07 11:35:28.597273048 +0000 UTC 
m=+895.393672047" Oct 07 11:35:28 crc kubenswrapper[4700]: I1007 11:35:28.981063 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.029154 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content\") pod \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.029224 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities\") pod \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.029475 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmn76\" (UniqueName: \"kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76\") pod \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\" (UID: \"1d2e0af8-3619-4c16-87e1-0ec8d500bc56\") " Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.030193 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities" (OuterVolumeSpecName: "utilities") pod "1d2e0af8-3619-4c16-87e1-0ec8d500bc56" (UID: "1d2e0af8-3619-4c16-87e1-0ec8d500bc56"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.039703 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76" (OuterVolumeSpecName: "kube-api-access-mmn76") pod "1d2e0af8-3619-4c16-87e1-0ec8d500bc56" (UID: "1d2e0af8-3619-4c16-87e1-0ec8d500bc56"). InnerVolumeSpecName "kube-api-access-mmn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.058911 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d2e0af8-3619-4c16-87e1-0ec8d500bc56" (UID: "1d2e0af8-3619-4c16-87e1-0ec8d500bc56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.131596 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.131642 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.131662 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmn76\" (UniqueName: \"kubernetes.io/projected/1d2e0af8-3619-4c16-87e1-0ec8d500bc56-kube-api-access-mmn76\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.580769 4700 generic.go:334] "Generic (PLEG): container finished" podID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" 
containerID="cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651" exitCode=0 Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.580872 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22wzl" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.580913 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerDied","Data":"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651"} Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.580971 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22wzl" event={"ID":"1d2e0af8-3619-4c16-87e1-0ec8d500bc56","Type":"ContainerDied","Data":"5b2f35fd101ce4ce07293ea4dfa024642b61cf64191d20411f2db80d30da715f"} Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.580996 4700 scope.go:117] "RemoveContainer" containerID="cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.618919 4700 scope.go:117] "RemoveContainer" containerID="c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.644639 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.654263 4700 scope.go:117] "RemoveContainer" containerID="be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.654672 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-22wzl"] Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.687370 4700 scope.go:117] "RemoveContainer" containerID="cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651" Oct 07 
11:35:29 crc kubenswrapper[4700]: E1007 11:35:29.688029 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651\": container with ID starting with cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651 not found: ID does not exist" containerID="cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.688084 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651"} err="failed to get container status \"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651\": rpc error: code = NotFound desc = could not find container \"cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651\": container with ID starting with cd39c7c9c5fd9f77a3b80a46dc90a114d2672886f47f7ee77b0b37dea0505651 not found: ID does not exist" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.688119 4700 scope.go:117] "RemoveContainer" containerID="c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2" Oct 07 11:35:29 crc kubenswrapper[4700]: E1007 11:35:29.688587 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2\": container with ID starting with c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2 not found: ID does not exist" containerID="c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.688642 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2"} err="failed to get container status 
\"c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2\": rpc error: code = NotFound desc = could not find container \"c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2\": container with ID starting with c1af56ae52767534b006bd40fe0c848d97e275c6df90caf6725b512e70abafc2 not found: ID does not exist" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.688675 4700 scope.go:117] "RemoveContainer" containerID="be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42" Oct 07 11:35:29 crc kubenswrapper[4700]: E1007 11:35:29.689435 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42\": container with ID starting with be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42 not found: ID does not exist" containerID="be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.689478 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42"} err="failed to get container status \"be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42\": rpc error: code = NotFound desc = could not find container \"be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42\": container with ID starting with be66d1cb9cc0eba585fb19a23d7efb47793d65cda57b07b01b22265fe02c9c42 not found: ID does not exist" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.831738 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-jv8c7" Oct 07 11:35:29 crc kubenswrapper[4700]: I1007 11:35:29.967142 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" 
path="/var/lib/kubelet/pods/1d2e0af8-3619-4c16-87e1-0ec8d500bc56/volumes" Oct 07 11:35:30 crc kubenswrapper[4700]: I1007 11:35:30.141008 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-9zh29" Oct 07 11:35:34 crc kubenswrapper[4700]: I1007 11:35:34.020058 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:34 crc kubenswrapper[4700]: I1007 11:35:34.021186 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:34 crc kubenswrapper[4700]: I1007 11:35:34.079399 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:34 crc kubenswrapper[4700]: I1007 11:35:34.685862 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:35 crc kubenswrapper[4700]: I1007 11:35:35.248843 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:36 crc kubenswrapper[4700]: I1007 11:35:36.643688 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zk2n7" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="registry-server" containerID="cri-o://d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce" gracePeriod=2 Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.074439 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.154656 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities\") pod \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.154822 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content\") pod \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.154892 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8jvf\" (UniqueName: \"kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf\") pod \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\" (UID: \"239c24ff-3d2a-4e4a-a1d9-49e303d579ec\") " Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.156037 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities" (OuterVolumeSpecName: "utilities") pod "239c24ff-3d2a-4e4a-a1d9-49e303d579ec" (UID: "239c24ff-3d2a-4e4a-a1d9-49e303d579ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.162784 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf" (OuterVolumeSpecName: "kube-api-access-g8jvf") pod "239c24ff-3d2a-4e4a-a1d9-49e303d579ec" (UID: "239c24ff-3d2a-4e4a-a1d9-49e303d579ec"). InnerVolumeSpecName "kube-api-access-g8jvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.256465 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8jvf\" (UniqueName: \"kubernetes.io/projected/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-kube-api-access-g8jvf\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.256493 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.397574 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "239c24ff-3d2a-4e4a-a1d9-49e303d579ec" (UID: "239c24ff-3d2a-4e4a-a1d9-49e303d579ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.458602 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239c24ff-3d2a-4e4a-a1d9-49e303d579ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.656687 4700 generic.go:334] "Generic (PLEG): container finished" podID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerID="d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce" exitCode=0 Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.656738 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerDied","Data":"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce"} Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.656775 4700 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-zk2n7" event={"ID":"239c24ff-3d2a-4e4a-a1d9-49e303d579ec","Type":"ContainerDied","Data":"6e805956c3d5ccd363e49e952dd0f90e3b4d432e7d69ad6a275f52a92ae0bde6"} Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.656785 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zk2n7" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.656796 4700 scope.go:117] "RemoveContainer" containerID="d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.691352 4700 scope.go:117] "RemoveContainer" containerID="f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.703697 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.712195 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zk2n7"] Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.727629 4700 scope.go:117] "RemoveContainer" containerID="b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.765791 4700 scope.go:117] "RemoveContainer" containerID="d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce" Oct 07 11:35:37 crc kubenswrapper[4700]: E1007 11:35:37.766322 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce\": container with ID starting with d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce not found: ID does not exist" containerID="d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 
11:35:37.766603 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce"} err="failed to get container status \"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce\": rpc error: code = NotFound desc = could not find container \"d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce\": container with ID starting with d2377b48adc9d340f6f8b000d81de6a762f24e3a2ae597688ce1afe5847994ce not found: ID does not exist" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.766623 4700 scope.go:117] "RemoveContainer" containerID="f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2" Oct 07 11:35:37 crc kubenswrapper[4700]: E1007 11:35:37.766897 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2\": container with ID starting with f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2 not found: ID does not exist" containerID="f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.766943 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2"} err="failed to get container status \"f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2\": rpc error: code = NotFound desc = could not find container \"f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2\": container with ID starting with f8350cc94afb0be0b4e3d1ef00b334a5f611c06d20eceffb9d5451f17d827bb2 not found: ID does not exist" Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.766970 4700 scope.go:117] "RemoveContainer" containerID="b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343" Oct 07 11:35:37 crc 
kubenswrapper[4700]: E1007 11:35:37.767182 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343\": container with ID starting with b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343 not found: ID does not exist" containerID="b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343"
Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.767204 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343"} err="failed to get container status \"b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343\": rpc error: code = NotFound desc = could not find container \"b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343\": container with ID starting with b965c7660faee4ed97583531f73d0e0a699cc4e2194963dfec010f47e4326343 not found: ID does not exist"
Oct 07 11:35:37 crc kubenswrapper[4700]: I1007 11:35:37.968480 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" path="/var/lib/kubelet/pods/239c24ff-3d2a-4e4a-a1d9-49e303d579ec/volumes"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.334034 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.334590 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.520755 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"]
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521344 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="extract-utilities"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521364 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="extract-utilities"
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521389 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="extract-content"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521396 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="extract-content"
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521415 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="extract-utilities"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521421 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="extract-utilities"
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521434 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521440 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521457 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521463 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: E1007 11:35:45.521475 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="extract-content"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521481 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="extract-content"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521610 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="239c24ff-3d2a-4e4a-a1d9-49e303d579ec" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.521625 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2e0af8-3619-4c16-87e1-0ec8d500bc56" containerName="registry-server"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.522509 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.525356 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.525553 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.525847 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.526025 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7wqwm"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.533789 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"]
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.564002 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"]
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.565138 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.566850 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.583026 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.583086 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rnj\" (UniqueName: \"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.596148 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"]
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.684723 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rnj\" (UniqueName: \"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.684827 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpgs\" (UniqueName: \"kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.684864 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.684882 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.684927 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.685814 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.711476 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rnj\" (UniqueName: \"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj\") pod \"dnsmasq-dns-675f4bcbfc-52b4v\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.786065 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpgs\" (UniqueName: \"kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.786121 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.786139 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.786973 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.786974 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.803143 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpgs\" (UniqueName: \"kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs\") pod \"dnsmasq-dns-78dd6ddcc-846qw\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.842071 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v"
Oct 07 11:35:45 crc kubenswrapper[4700]: I1007 11:35:45.886982 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw"
Oct 07 11:35:46 crc kubenswrapper[4700]: I1007 11:35:46.334199 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"]
Oct 07 11:35:46 crc kubenswrapper[4700]: I1007 11:35:46.343195 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 11:35:46 crc kubenswrapper[4700]: I1007 11:35:46.405376 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"]
Oct 07 11:35:46 crc kubenswrapper[4700]: W1007 11:35:46.415996 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f30c73_3f35_4269_9bf7_ea649fd9b616.slice/crio-5941882517f9c1a252c94fbf3625b2d402e27f49914296eb7ac207f08c594723 WatchSource:0}: Error finding container 5941882517f9c1a252c94fbf3625b2d402e27f49914296eb7ac207f08c594723: Status 404 returned error can't find the container with id 5941882517f9c1a252c94fbf3625b2d402e27f49914296eb7ac207f08c594723
Oct 07 11:35:46 crc kubenswrapper[4700]: I1007 11:35:46.745078 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v" event={"ID":"74a5f38a-0346-47ea-9f20-0272bd55a191","Type":"ContainerStarted","Data":"3cd85ad9f4c3f2f91de6220e56db79cc25e8fa52ec57d2cb6aaae2108bf710d4"}
Oct 07 11:35:46 crc kubenswrapper[4700]: I1007 11:35:46.746451 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw" event={"ID":"a0f30c73-3f35-4269-9bf7-ea649fd9b616","Type":"ContainerStarted","Data":"5941882517f9c1a252c94fbf3625b2d402e27f49914296eb7ac207f08c594723"}
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.375874 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"]
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.396872 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"]
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.400173 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.428818 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"]
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.531095 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.531145 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.531187 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthlx\" (UniqueName: \"kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.633068 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.633119 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.633156 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthlx\" (UniqueName: \"kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.634368 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.634443 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.680168 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthlx\" (UniqueName: \"kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx\") pod \"dnsmasq-dns-666b6646f7-dtxl5\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.728645 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.897287 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"]
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.930214 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"]
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.934032 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:48 crc kubenswrapper[4700]: I1007 11:35:48.941504 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"]
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.041648 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.041719 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqjxc\" (UniqueName: \"kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.041758 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.143635 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqjxc\" (UniqueName: \"kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.143700 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.143793 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.146999 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.147611 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.177708 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqjxc\" (UniqueName: \"kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc\") pod \"dnsmasq-dns-57d769cc4f-42x4h\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.254951 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.307635 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"]
Oct 07 11:35:49 crc kubenswrapper[4700]: W1007 11:35:49.309835 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88daacf3_e024_4e18_8631_406b6e504a5f.slice/crio-2506fcf6b0666e77632a7f478646c6a81df87879ca3624c6eb911d281430fe1d WatchSource:0}: Error finding container 2506fcf6b0666e77632a7f478646c6a81df87879ca3624c6eb911d281430fe1d: Status 404 returned error can't find the container with id 2506fcf6b0666e77632a7f478646c6a81df87879ca3624c6eb911d281430fe1d
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.719709 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"]
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.733361 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 11:35:49 crc kubenswrapper[4700]: W1007 11:35:49.733452 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1936b589_6677_4ce8_8281_58612a3a5687.slice/crio-e332884553931e1d08f6e7774f378c4248c545bbbf22e091b94a647d004d252f WatchSource:0}: Error finding container e332884553931e1d08f6e7774f378c4248c545bbbf22e091b94a647d004d252f: Status 404 returned error can't find the container with id e332884553931e1d08f6e7774f378c4248c545bbbf22e091b94a647d004d252f
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.734936 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.737057 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.737269 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vd2wl"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.737853 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.738170 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.738482 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.738966 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.744683 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.752242 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.783503 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" event={"ID":"1936b589-6677-4ce8-8281-58612a3a5687","Type":"ContainerStarted","Data":"e332884553931e1d08f6e7774f378c4248c545bbbf22e091b94a647d004d252f"}
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.784808 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" event={"ID":"88daacf3-e024-4e18-8631-406b6e504a5f","Type":"ContainerStarted","Data":"2506fcf6b0666e77632a7f478646c6a81df87879ca3624c6eb911d281430fe1d"}
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861716 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861801 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861832 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861853 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861885 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861908 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861938 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djh2\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.861963 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.862024 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.862051 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.862069 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962862 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962906 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962922 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962950 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962971 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.963901 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964239 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964256 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.962993 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4djh2\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964326 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964371 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964389 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964407 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.964482 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.966063 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.971033 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.971674 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.979490 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.980014 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.981354 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.991530 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:49 crc kubenswrapper[4700]: I1007 11:35:49.992377 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djh2\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.012557 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " pod="openstack/rabbitmq-server-0"
Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.061497 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.062638 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.071674 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072150 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072282 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072386 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072483 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072713 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072789 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g79lm" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.072842 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.078278 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177081 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177132 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177155 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177181 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177205 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177222 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177242 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177267 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177335 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177355 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.177377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvnd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279064 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279546 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279572 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279592 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279613 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279621 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.279629 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.280174 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.280261 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.280395 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.280439 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc 
kubenswrapper[4700]: I1007 11:35:50.280484 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvnd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.280914 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.281468 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.282155 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.282379 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.282710 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.284095 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.284778 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.287379 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.287441 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.299542 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvnd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.306788 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:50 crc kubenswrapper[4700]: I1007 11:35:50.393949 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.256414 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.258718 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.262869 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.263067 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.263187 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dtrsz" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.263410 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.269505 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.271509 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 11:35:51 crc kubenswrapper[4700]: 
I1007 11:35:51.281396 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397073 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397141 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397172 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-kolla-config\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397203 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-secrets\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397238 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " 
pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397260 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397322 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397351 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0706b451-8379-454a-bf71-483b779cb17b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.397381 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkx4\" (UniqueName: \"kubernetes.io/projected/0706b451-8379-454a-bf71-483b779cb17b-kube-api-access-8hkx4\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499108 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hkx4\" (UniqueName: \"kubernetes.io/projected/0706b451-8379-454a-bf71-483b779cb17b-kube-api-access-8hkx4\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " 
pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499177 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499206 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499242 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-secrets\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499261 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-kolla-config\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499311 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499356 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499399 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.499432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0706b451-8379-454a-bf71-483b779cb17b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.500162 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0706b451-8379-454a-bf71-483b779cb17b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.500587 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.503079 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.504521 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.504963 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0706b451-8379-454a-bf71-483b779cb17b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.510021 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.516831 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkx4\" (UniqueName: \"kubernetes.io/projected/0706b451-8379-454a-bf71-483b779cb17b-kube-api-access-8hkx4\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.517748 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-secrets\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.518802 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0706b451-8379-454a-bf71-483b779cb17b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.531656 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0706b451-8379-454a-bf71-483b779cb17b\") " pod="openstack/openstack-galera-0" Oct 07 11:35:51 crc kubenswrapper[4700]: I1007 11:35:51.590505 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.463019 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.467514 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.471721 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wbl6q" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.471774 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.473222 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.473762 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.477436 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617182 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617644 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcp5\" (UniqueName: \"kubernetes.io/projected/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kube-api-access-fwcp5\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617731 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617773 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617815 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617844 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617885 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.617943 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.618011 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.719903 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.719974 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.719995 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcp5\" (UniqueName: \"kubernetes.io/projected/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kube-api-access-fwcp5\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720044 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720068 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720090 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720107 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720129 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720156 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " 
pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720258 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720866 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.720967 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.721861 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.722027 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ba797-8da0-4e56-8dcc-14d7d2b0e217-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 
11:35:52.725994 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.726988 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.729808 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793ba797-8da0-4e56-8dcc-14d7d2b0e217-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.740139 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcp5\" (UniqueName: \"kubernetes.io/projected/793ba797-8da0-4e56-8dcc-14d7d2b0e217-kube-api-access-fwcp5\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.764468 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"793ba797-8da0-4e56-8dcc-14d7d2b0e217\") " pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.783639 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.887189 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.888208 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.889759 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.889904 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.890368 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hw8zc" Oct 07 11:35:52 crc kubenswrapper[4700]: I1007 11:35:52.896628 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.029298 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.029376 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.029439 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-config-data\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.029459 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kolla-config\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.029488 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqnn9\" (UniqueName: \"kubernetes.io/projected/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kube-api-access-fqnn9\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.131332 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.131391 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.131494 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-config-data\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " 
pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.131527 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kolla-config\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.131565 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnn9\" (UniqueName: \"kubernetes.io/projected/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kube-api-access-fqnn9\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.132765 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kolla-config\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.133102 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-config-data\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.135399 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.138109 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.157495 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqnn9\" (UniqueName: \"kubernetes.io/projected/74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81-kube-api-access-fqnn9\") pod \"memcached-0\" (UID: \"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81\") " pod="openstack/memcached-0" Oct 07 11:35:53 crc kubenswrapper[4700]: I1007 11:35:53.205599 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 11:35:54 crc kubenswrapper[4700]: I1007 11:35:54.876125 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:35:54 crc kubenswrapper[4700]: I1007 11:35:54.877588 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:35:54 crc kubenswrapper[4700]: I1007 11:35:54.880348 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zjw58" Oct 07 11:35:54 crc kubenswrapper[4700]: I1007 11:35:54.908581 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:35:54 crc kubenswrapper[4700]: I1007 11:35:54.966511 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbs5p\" (UniqueName: \"kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p\") pod \"kube-state-metrics-0\" (UID: \"67b87725-3618-4837-b1b5-c98afe5de4a4\") " pod="openstack/kube-state-metrics-0" Oct 07 11:35:55 crc kubenswrapper[4700]: I1007 11:35:55.068256 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbs5p\" (UniqueName: 
\"kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p\") pod \"kube-state-metrics-0\" (UID: \"67b87725-3618-4837-b1b5-c98afe5de4a4\") " pod="openstack/kube-state-metrics-0" Oct 07 11:35:55 crc kubenswrapper[4700]: I1007 11:35:55.098663 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbs5p\" (UniqueName: \"kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p\") pod \"kube-state-metrics-0\" (UID: \"67b87725-3618-4837-b1b5-c98afe5de4a4\") " pod="openstack/kube-state-metrics-0" Oct 07 11:35:55 crc kubenswrapper[4700]: I1007 11:35:55.206232 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.795950 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m9nzp"] Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.797440 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.801033 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.801264 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vk89f" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.801456 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.803578 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m9nzp"] Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.825567 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ksvhb"] Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.827126 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.832580 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ksvhb"] Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933862 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88c8n\" (UniqueName: \"kubernetes.io/projected/36a3b431-4387-4ba7-a2c1-e72622594a8c-kube-api-access-88c8n\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933912 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-combined-ca-bundle\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933934 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run-ovn\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933954 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-etc-ovs\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933972 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.933991 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-log\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934016 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-lib\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934058 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-log-ovn\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934079 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-ovn-controller-tls-certs\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934105 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/36a3b431-4387-4ba7-a2c1-e72622594a8c-scripts\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934129 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-run\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934145 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f39e97ad-dbbb-45d4-a595-8f675165ed7d-scripts\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:58 crc kubenswrapper[4700]: I1007 11:35:58.934168 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs8sk\" (UniqueName: \"kubernetes.io/projected/f39e97ad-dbbb-45d4-a595-8f675165ed7d-kube-api-access-vs8sk\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035384 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-lib\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035459 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-log-ovn\") pod 
\"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035484 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-ovn-controller-tls-certs\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035514 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a3b431-4387-4ba7-a2c1-e72622594a8c-scripts\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035539 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-run\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035553 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f39e97ad-dbbb-45d4-a595-8f675165ed7d-scripts\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035575 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs8sk\" (UniqueName: \"kubernetes.io/projected/f39e97ad-dbbb-45d4-a595-8f675165ed7d-kube-api-access-vs8sk\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 
crc kubenswrapper[4700]: I1007 11:35:59.035599 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88c8n\" (UniqueName: \"kubernetes.io/projected/36a3b431-4387-4ba7-a2c1-e72622594a8c-kube-api-access-88c8n\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035619 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-combined-ca-bundle\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035652 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run-ovn\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035670 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-etc-ovs\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035687 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.035709 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-log\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.036264 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-log\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.036416 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-lib\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.036484 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-log-ovn\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.037927 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run-ovn\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.038028 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-var-run\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " 
pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.038058 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f39e97ad-dbbb-45d4-a595-8f675165ed7d-var-run\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.038497 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.039265 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f39e97ad-dbbb-45d4-a595-8f675165ed7d-scripts\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.040789 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a3b431-4387-4ba7-a2c1-e72622594a8c-scripts\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.041636 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.042069 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-ovn-controller-tls-certs\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.044119 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a3b431-4387-4ba7-a2c1-e72622594a8c-etc-ovs\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045012 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39e97ad-dbbb-45d4-a595-8f675165ed7d-combined-ca-bundle\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045281 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045586 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cwsqx" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045769 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045996 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.045992 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.047846 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.056036 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs8sk\" (UniqueName: \"kubernetes.io/projected/f39e97ad-dbbb-45d4-a595-8f675165ed7d-kube-api-access-vs8sk\") pod \"ovn-controller-m9nzp\" (UID: \"f39e97ad-dbbb-45d4-a595-8f675165ed7d\") " pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.061652 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88c8n\" (UniqueName: \"kubernetes.io/projected/36a3b431-4387-4ba7-a2c1-e72622594a8c-kube-api-access-88c8n\") pod \"ovn-controller-ovs-ksvhb\" (UID: \"36a3b431-4387-4ba7-a2c1-e72622594a8c\") " pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.122725 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m9nzp" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138295 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2fh\" (UniqueName: \"kubernetes.io/projected/57e7be90-ef51-432f-afa7-edbff56123e0-kube-api-access-bq2fh\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138352 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-config\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138430 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138451 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc 
kubenswrapper[4700]: I1007 11:35:59.138508 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138531 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.138545 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.146198 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240428 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240601 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2fh\" (UniqueName: \"kubernetes.io/projected/57e7be90-ef51-432f-afa7-edbff56123e0-kube-api-access-bq2fh\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240641 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-config\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240706 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240765 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240820 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240852 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.240885 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.241012 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.241157 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.241938 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-scripts\") pod \"ovsdbserver-nb-0\" 
(UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.242079 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57e7be90-ef51-432f-afa7-edbff56123e0-config\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.245108 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.245685 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.253150 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57e7be90-ef51-432f-afa7-edbff56123e0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.263029 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2fh\" (UniqueName: \"kubernetes.io/projected/57e7be90-ef51-432f-afa7-edbff56123e0-kube-api-access-bq2fh\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.272289 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"57e7be90-ef51-432f-afa7-edbff56123e0\") " pod="openstack/ovsdbserver-nb-0" Oct 07 11:35:59 crc kubenswrapper[4700]: I1007 11:35:59.421016 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.681212 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.683900 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.687948 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.688058 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.688290 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.688801 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tjqvz" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.703940 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.783720 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 
crc kubenswrapper[4700]: I1007 11:36:01.784009 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784061 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784165 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784214 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784249 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784270 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vpvbp\" (UniqueName: \"kubernetes.io/projected/f13aad27-7d23-4de6-8de0-c8a61809de5d-kube-api-access-vpvbp\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.784406 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885497 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885580 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885622 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885685 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885710 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpvbp\" (UniqueName: \"kubernetes.io/projected/f13aad27-7d23-4de6-8de0-c8a61809de5d-kube-api-access-vpvbp\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885774 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885797 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.885813 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.886610 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.886951 4700 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.887178 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.887328 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13aad27-7d23-4de6-8de0-c8a61809de5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.895151 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.895186 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.905662 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpvbp\" (UniqueName: \"kubernetes.io/projected/f13aad27-7d23-4de6-8de0-c8a61809de5d-kube-api-access-vpvbp\") pod 
\"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.906773 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13aad27-7d23-4de6-8de0-c8a61809de5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:01 crc kubenswrapper[4700]: I1007 11:36:01.909708 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f13aad27-7d23-4de6-8de0-c8a61809de5d\") " pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:02 crc kubenswrapper[4700]: I1007 11:36:02.074608 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:02 crc kubenswrapper[4700]: I1007 11:36:02.125105 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 11:36:02 crc kubenswrapper[4700]: I1007 11:36:02.899699 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"793ba797-8da0-4e56-8dcc-14d7d2b0e217","Type":"ContainerStarted","Data":"56957f514d8e7d5d07a0f5ccdba415ed6438c2859452a0f3725528ca2d307be6"} Oct 07 11:36:02 crc kubenswrapper[4700]: I1007 11:36:02.902596 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 11:36:02 crc kubenswrapper[4700]: I1007 11:36:02.909476 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 11:36:02 crc kubenswrapper[4700]: W1007 11:36:02.913225 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74b58ae8_a57b_4c4d_9ca1_97bfa7ca8a81.slice/crio-32993837d37c8f4721404a73a5a2d151145f82ce8beee18e7144822cedbf86cc WatchSource:0}: Error finding container 32993837d37c8f4721404a73a5a2d151145f82ce8beee18e7144822cedbf86cc: Status 404 returned error can't find the container with id 32993837d37c8f4721404a73a5a2d151145f82ce8beee18e7144822cedbf86cc Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.058336 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.155012 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:36:03 crc kubenswrapper[4700]: W1007 11:36:03.183822 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1c2675_0718_4979_98b8_9227bc9c5f18.slice/crio-97dee82d6cecfb4d568af852269a802ff35fe4ba299cfd5641769b22d8a1468e WatchSource:0}: Error finding container 97dee82d6cecfb4d568af852269a802ff35fe4ba299cfd5641769b22d8a1468e: Status 404 returned error can't find the container with id 97dee82d6cecfb4d568af852269a802ff35fe4ba299cfd5641769b22d8a1468e Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.208468 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.213731 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m9nzp"] Oct 07 11:36:03 crc kubenswrapper[4700]: W1007 11:36:03.283685 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67b87725_3618_4837_b1b5_c98afe5de4a4.slice/crio-ba7afb090d031851d61e5e006f06b697007f89d947e9084f3453d1b4100f8bf3 WatchSource:0}: Error finding container ba7afb090d031851d61e5e006f06b697007f89d947e9084f3453d1b4100f8bf3: Status 
404 returned error can't find the container with id ba7afb090d031851d61e5e006f06b697007f89d947e9084f3453d1b4100f8bf3 Oct 07 11:36:03 crc kubenswrapper[4700]: W1007 11:36:03.285042 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39e97ad_dbbb_45d4_a595_8f675165ed7d.slice/crio-3bc95b16abf4233e2c79bd7c9f0519018d7ea9fb8f5ed2dc11e89fb3a351c235 WatchSource:0}: Error finding container 3bc95b16abf4233e2c79bd7c9f0519018d7ea9fb8f5ed2dc11e89fb3a351c235: Status 404 returned error can't find the container with id 3bc95b16abf4233e2c79bd7c9f0519018d7ea9fb8f5ed2dc11e89fb3a351c235 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.315008 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 11:36:03 crc kubenswrapper[4700]: W1007 11:36:03.327440 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57e7be90_ef51_432f_afa7_edbff56123e0.slice/crio-8a5abcd95aed65f8e929a45e0a3357f6917b4705db46cc5dbee6a3702c38f546 WatchSource:0}: Error finding container 8a5abcd95aed65f8e929a45e0a3357f6917b4705db46cc5dbee6a3702c38f546: Status 404 returned error can't find the container with id 8a5abcd95aed65f8e929a45e0a3357f6917b4705db46cc5dbee6a3702c38f546 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.413235 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ksvhb"] Oct 07 11:36:03 crc kubenswrapper[4700]: W1007 11:36:03.416744 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a3b431_4387_4ba7_a2c1_e72622594a8c.slice/crio-629b8b1df6b0856d6b9ad86fd0e2c7fd5b9b6b24f30cab20db27d96b9e1bf425 WatchSource:0}: Error finding container 629b8b1df6b0856d6b9ad86fd0e2c7fd5b9b6b24f30cab20db27d96b9e1bf425: Status 404 returned error can't find the container with id 
629b8b1df6b0856d6b9ad86fd0e2c7fd5b9b6b24f30cab20db27d96b9e1bf425 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.915454 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"57e7be90-ef51-432f-afa7-edbff56123e0","Type":"ContainerStarted","Data":"8a5abcd95aed65f8e929a45e0a3357f6917b4705db46cc5dbee6a3702c38f546"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.917780 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0706b451-8379-454a-bf71-483b779cb17b","Type":"ContainerStarted","Data":"7e8b2a9f81d0587567da00a6794852384a6a05cc0403e6a5d204ee9e822faff8"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.919743 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81","Type":"ContainerStarted","Data":"32993837d37c8f4721404a73a5a2d151145f82ce8beee18e7144822cedbf86cc"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.921958 4700 generic.go:334] "Generic (PLEG): container finished" podID="a0f30c73-3f35-4269-9bf7-ea649fd9b616" containerID="55bd9c55f6834ca6ee5fd2a1e0413df6982d0f07725aad1273d9c38f89f28819" exitCode=0 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.922002 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw" event={"ID":"a0f30c73-3f35-4269-9bf7-ea649fd9b616","Type":"ContainerDied","Data":"55bd9c55f6834ca6ee5fd2a1e0413df6982d0f07725aad1273d9c38f89f28819"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.937605 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerStarted","Data":"f94595c8f81952bd3c5ae2e91916c5628b3b15be08690c6ed5e0048ae938e164"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.940752 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="88daacf3-e024-4e18-8631-406b6e504a5f" containerID="ace19e82a45ca16d1efe113d34bd3bed1905fbc9ee1a9662ff060f79c20997d1" exitCode=0 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.940874 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" event={"ID":"88daacf3-e024-4e18-8631-406b6e504a5f","Type":"ContainerDied","Data":"ace19e82a45ca16d1efe113d34bd3bed1905fbc9ee1a9662ff060f79c20997d1"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.942162 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m9nzp" event={"ID":"f39e97ad-dbbb-45d4-a595-8f675165ed7d","Type":"ContainerStarted","Data":"3bc95b16abf4233e2c79bd7c9f0519018d7ea9fb8f5ed2dc11e89fb3a351c235"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.943435 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.944200 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67b87725-3618-4837-b1b5-c98afe5de4a4","Type":"ContainerStarted","Data":"ba7afb090d031851d61e5e006f06b697007f89d947e9084f3453d1b4100f8bf3"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.947056 4700 generic.go:334] "Generic (PLEG): container finished" podID="74a5f38a-0346-47ea-9f20-0272bd55a191" containerID="a147a05e54db7fe13746359862aca208b0bba69c58810fecb6d80480aff22aa7" exitCode=0 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.947165 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v" event={"ID":"74a5f38a-0346-47ea-9f20-0272bd55a191","Type":"ContainerDied","Data":"a147a05e54db7fe13746359862aca208b0bba69c58810fecb6d80480aff22aa7"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.952024 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" 
event={"ID":"1936b589-6677-4ce8-8281-58612a3a5687","Type":"ContainerDied","Data":"a328b1848b9535811a64d0b1e0569a54d482c5a03ea7e6b6e6f3395fdbc55d22"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.952970 4700 generic.go:334] "Generic (PLEG): container finished" podID="1936b589-6677-4ce8-8281-58612a3a5687" containerID="a328b1848b9535811a64d0b1e0569a54d482c5a03ea7e6b6e6f3395fdbc55d22" exitCode=0 Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.989698 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerStarted","Data":"97dee82d6cecfb4d568af852269a802ff35fe4ba299cfd5641769b22d8a1468e"} Oct 07 11:36:03 crc kubenswrapper[4700]: I1007 11:36:03.990148 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ksvhb" event={"ID":"36a3b431-4387-4ba7-a2c1-e72622594a8c","Type":"ContainerStarted","Data":"629b8b1df6b0856d6b9ad86fd0e2c7fd5b9b6b24f30cab20db27d96b9e1bf425"} Oct 07 11:36:05 crc kubenswrapper[4700]: W1007 11:36:05.363973 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13aad27_7d23_4de6_8de0_c8a61809de5d.slice/crio-3022ea1338e25cf8d73113f9835dd0844109ccb9db93dd5af4c94365e908f6e0 WatchSource:0}: Error finding container 3022ea1338e25cf8d73113f9835dd0844109ccb9db93dd5af4c94365e908f6e0: Status 404 returned error can't find the container with id 3022ea1338e25cf8d73113f9835dd0844109ccb9db93dd5af4c94365e908f6e0 Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.529503 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.535629 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.696847 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config\") pod \"74a5f38a-0346-47ea-9f20-0272bd55a191\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.697071 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config\") pod \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.697151 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc\") pod \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.697198 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rnj\" (UniqueName: \"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj\") pod \"74a5f38a-0346-47ea-9f20-0272bd55a191\" (UID: \"74a5f38a-0346-47ea-9f20-0272bd55a191\") " Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.697279 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzpgs\" (UniqueName: \"kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs\") pod \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\" (UID: \"a0f30c73-3f35-4269-9bf7-ea649fd9b616\") " Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.700742 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj" (OuterVolumeSpecName: "kube-api-access-p5rnj") pod "74a5f38a-0346-47ea-9f20-0272bd55a191" (UID: "74a5f38a-0346-47ea-9f20-0272bd55a191"). InnerVolumeSpecName "kube-api-access-p5rnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.702194 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs" (OuterVolumeSpecName: "kube-api-access-fzpgs") pod "a0f30c73-3f35-4269-9bf7-ea649fd9b616" (UID: "a0f30c73-3f35-4269-9bf7-ea649fd9b616"). InnerVolumeSpecName "kube-api-access-fzpgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.713105 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config" (OuterVolumeSpecName: "config") pod "a0f30c73-3f35-4269-9bf7-ea649fd9b616" (UID: "a0f30c73-3f35-4269-9bf7-ea649fd9b616"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.728959 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config" (OuterVolumeSpecName: "config") pod "74a5f38a-0346-47ea-9f20-0272bd55a191" (UID: "74a5f38a-0346-47ea-9f20-0272bd55a191"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.747347 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0f30c73-3f35-4269-9bf7-ea649fd9b616" (UID: "a0f30c73-3f35-4269-9bf7-ea649fd9b616"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.799538 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.799834 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f30c73-3f35-4269-9bf7-ea649fd9b616-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.799846 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rnj\" (UniqueName: \"kubernetes.io/projected/74a5f38a-0346-47ea-9f20-0272bd55a191-kube-api-access-p5rnj\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.799858 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzpgs\" (UniqueName: \"kubernetes.io/projected/a0f30c73-3f35-4269-9bf7-ea649fd9b616-kube-api-access-fzpgs\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.799868 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a5f38a-0346-47ea-9f20-0272bd55a191-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.989209 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f13aad27-7d23-4de6-8de0-c8a61809de5d","Type":"ContainerStarted","Data":"3022ea1338e25cf8d73113f9835dd0844109ccb9db93dd5af4c94365e908f6e0"} Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.991522 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.991592 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-846qw" event={"ID":"a0f30c73-3f35-4269-9bf7-ea649fd9b616","Type":"ContainerDied","Data":"5941882517f9c1a252c94fbf3625b2d402e27f49914296eb7ac207f08c594723"} Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.991677 4700 scope.go:117] "RemoveContainer" containerID="55bd9c55f6834ca6ee5fd2a1e0413df6982d0f07725aad1273d9c38f89f28819" Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.995075 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v" event={"ID":"74a5f38a-0346-47ea-9f20-0272bd55a191","Type":"ContainerDied","Data":"3cd85ad9f4c3f2f91de6220e56db79cc25e8fa52ec57d2cb6aaae2108bf710d4"} Oct 07 11:36:05 crc kubenswrapper[4700]: I1007 11:36:05.995147 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-52b4v" Oct 07 11:36:06 crc kubenswrapper[4700]: I1007 11:36:06.083541 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"] Oct 07 11:36:06 crc kubenswrapper[4700]: I1007 11:36:06.088969 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-52b4v"] Oct 07 11:36:06 crc kubenswrapper[4700]: I1007 11:36:06.103253 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"] Oct 07 11:36:06 crc kubenswrapper[4700]: I1007 11:36:06.109250 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-846qw"] Oct 07 11:36:07 crc kubenswrapper[4700]: I1007 11:36:07.974133 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a5f38a-0346-47ea-9f20-0272bd55a191" path="/var/lib/kubelet/pods/74a5f38a-0346-47ea-9f20-0272bd55a191/volumes" Oct 07 11:36:07 crc 
kubenswrapper[4700]: I1007 11:36:07.975438 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f30c73-3f35-4269-9bf7-ea649fd9b616" path="/var/lib/kubelet/pods/a0f30c73-3f35-4269-9bf7-ea649fd9b616/volumes" Oct 07 11:36:10 crc kubenswrapper[4700]: I1007 11:36:10.238745 4700 scope.go:117] "RemoveContainer" containerID="a147a05e54db7fe13746359862aca208b0bba69c58810fecb6d80480aff22aa7" Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.055804 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" event={"ID":"1936b589-6677-4ce8-8281-58612a3a5687","Type":"ContainerStarted","Data":"bbcb9fd665a718770dea6974fac5fed81ddd7235e5bddad8973ff5fe49edc4c2"} Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.056778 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.071918 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" event={"ID":"88daacf3-e024-4e18-8631-406b6e504a5f","Type":"ContainerStarted","Data":"e36a8f816e601402ab42c46f48f3465c9853cf0fa6b20480e2cdbdd05c9a6a86"} Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.072201 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.103459 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" podStartSLOduration=11.147982952 podStartE2EDuration="24.103438126s" podCreationTimestamp="2025-10-07 11:35:48 +0000 UTC" firstStartedPulling="2025-10-07 11:35:49.7376575 +0000 UTC m=+916.534056489" lastFinishedPulling="2025-10-07 11:36:02.693112674 +0000 UTC m=+929.489511663" observedRunningTime="2025-10-07 11:36:12.098411394 +0000 UTC m=+938.894810413" watchObservedRunningTime="2025-10-07 11:36:12.103438126 +0000 UTC 
m=+938.899837115" Oct 07 11:36:12 crc kubenswrapper[4700]: I1007 11:36:12.117454 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" podStartSLOduration=10.763273183999999 podStartE2EDuration="24.117439622s" podCreationTimestamp="2025-10-07 11:35:48 +0000 UTC" firstStartedPulling="2025-10-07 11:35:49.313706055 +0000 UTC m=+916.110105044" lastFinishedPulling="2025-10-07 11:36:02.667872503 +0000 UTC m=+929.464271482" observedRunningTime="2025-10-07 11:36:12.114711161 +0000 UTC m=+938.911110140" watchObservedRunningTime="2025-10-07 11:36:12.117439622 +0000 UTC m=+938.913838611" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.086750 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67b87725-3618-4837-b1b5-c98afe5de4a4","Type":"ContainerStarted","Data":"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.088420 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.092948 4700 generic.go:334] "Generic (PLEG): container finished" podID="36a3b431-4387-4ba7-a2c1-e72622594a8c" containerID="de7790a4578663800c7abbb60c35d1a96e9c43b754e7b7279c31bd7831eebe39" exitCode=0 Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.093151 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ksvhb" event={"ID":"36a3b431-4387-4ba7-a2c1-e72622594a8c","Type":"ContainerDied","Data":"de7790a4578663800c7abbb60c35d1a96e9c43b754e7b7279c31bd7831eebe39"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.094743 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81","Type":"ContainerStarted","Data":"041e53c14f3d01992a562c6c7ce19bd9efbda01f848961e4e9a6ff1f5fb84853"} Oct 07 
11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.095135 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.099744 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f13aad27-7d23-4de6-8de0-c8a61809de5d","Type":"ContainerStarted","Data":"59021f5652df7afcebf5d62fd01de7f0bcfc3d4a153825fa4d21bdea66c3265b"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.105972 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"57e7be90-ef51-432f-afa7-edbff56123e0","Type":"ContainerStarted","Data":"360159a1499f5589bca140e54f6d828628e20262635543d114259208cf3abb47"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.107378 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.893801009 podStartE2EDuration="19.10736083s" podCreationTimestamp="2025-10-07 11:35:54 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.288120936 +0000 UTC m=+930.084519925" lastFinishedPulling="2025-10-07 11:36:11.501680757 +0000 UTC m=+938.298079746" observedRunningTime="2025-10-07 11:36:13.102878153 +0000 UTC m=+939.899277152" watchObservedRunningTime="2025-10-07 11:36:13.10736083 +0000 UTC m=+939.903759829" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.111515 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m9nzp" event={"ID":"f39e97ad-dbbb-45d4-a595-8f675165ed7d","Type":"ContainerStarted","Data":"286a78cc8b478faf4a6b43e3bf9ef303f5e656ba8968d4ca4b67ad07fd2cacce"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.112215 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m9nzp" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.123365 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/memcached-0" podStartSLOduration=13.598578424 podStartE2EDuration="21.123343239s" podCreationTimestamp="2025-10-07 11:35:52 +0000 UTC" firstStartedPulling="2025-10-07 11:36:02.921045219 +0000 UTC m=+929.717444208" lastFinishedPulling="2025-10-07 11:36:10.445810034 +0000 UTC m=+937.242209023" observedRunningTime="2025-10-07 11:36:13.119690833 +0000 UTC m=+939.916089842" watchObservedRunningTime="2025-10-07 11:36:13.123343239 +0000 UTC m=+939.919742258" Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.132364 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0706b451-8379-454a-bf71-483b779cb17b","Type":"ContainerStarted","Data":"8a1220e5263fbae539eb7adbe157ab3c1427498b94994dd3e9fe66d7f134a853"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.135946 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"793ba797-8da0-4e56-8dcc-14d7d2b0e217","Type":"ContainerStarted","Data":"58addc835f737f8b1580a97cb116ff1ced05aca092671d4b1e5f2c189b1fb4cb"} Oct 07 11:36:13 crc kubenswrapper[4700]: I1007 11:36:13.194797 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m9nzp" podStartSLOduration=7.6726550620000005 podStartE2EDuration="15.194783348s" podCreationTimestamp="2025-10-07 11:35:58 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.288129016 +0000 UTC m=+930.084528005" lastFinishedPulling="2025-10-07 11:36:10.810257262 +0000 UTC m=+937.606656291" observedRunningTime="2025-10-07 11:36:13.187316593 +0000 UTC m=+939.983715592" watchObservedRunningTime="2025-10-07 11:36:13.194783348 +0000 UTC m=+939.991182337" Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.145663 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerStarted","Data":"3ed13cf2f5a8cd7c7fa788fceb5e49f8318d6db89d3b8d267bcf7127ecee2337"} Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.149284 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerStarted","Data":"a46dc66bd8afce5928f040158bfee6805a39b6a68e5c8e88e1b819c3300cd2ab"} Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.163933 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ksvhb" event={"ID":"36a3b431-4387-4ba7-a2c1-e72622594a8c","Type":"ContainerStarted","Data":"a643c7fbc66ced13d03c213930a6545cf590a0951dd7a8322a5c84ab17882585"} Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.163984 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.163996 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ksvhb" event={"ID":"36a3b431-4387-4ba7-a2c1-e72622594a8c","Type":"ContainerStarted","Data":"34e40c3d33f72d2bcd3938e36ea50cd6bc4a6ada00b2862dd505e67a9c278a59"} Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.165915 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:36:14 crc kubenswrapper[4700]: I1007 11:36:14.204872 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ksvhb" podStartSLOduration=9.201271989 podStartE2EDuration="16.204848603s" podCreationTimestamp="2025-10-07 11:35:58 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.419394482 +0000 UTC m=+930.215793471" lastFinishedPulling="2025-10-07 11:36:10.422971056 +0000 UTC m=+937.219370085" observedRunningTime="2025-10-07 11:36:14.196630858 +0000 UTC m=+940.993029847" 
watchObservedRunningTime="2025-10-07 11:36:14.204848603 +0000 UTC m=+941.001247592" Oct 07 11:36:15 crc kubenswrapper[4700]: I1007 11:36:15.333808 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:36:15 crc kubenswrapper[4700]: I1007 11:36:15.333872 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.180583 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f13aad27-7d23-4de6-8de0-c8a61809de5d","Type":"ContainerStarted","Data":"0ceaf3c643e1944ebe654be80c37128b5dffdc4345eef4494428a53a457e325a"} Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.186754 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"57e7be90-ef51-432f-afa7-edbff56123e0","Type":"ContainerStarted","Data":"f301d723be9e644ba3d685554c53863ad2c40f241ae42072a8c144956a031f7f"} Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.189109 4700 generic.go:334] "Generic (PLEG): container finished" podID="0706b451-8379-454a-bf71-483b779cb17b" containerID="8a1220e5263fbae539eb7adbe157ab3c1427498b94994dd3e9fe66d7f134a853" exitCode=0 Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.189173 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0706b451-8379-454a-bf71-483b779cb17b","Type":"ContainerDied","Data":"8a1220e5263fbae539eb7adbe157ab3c1427498b94994dd3e9fe66d7f134a853"} Oct 
07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.192469 4700 generic.go:334] "Generic (PLEG): container finished" podID="793ba797-8da0-4e56-8dcc-14d7d2b0e217" containerID="58addc835f737f8b1580a97cb116ff1ced05aca092671d4b1e5f2c189b1fb4cb" exitCode=0 Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.192541 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"793ba797-8da0-4e56-8dcc-14d7d2b0e217","Type":"ContainerDied","Data":"58addc835f737f8b1580a97cb116ff1ced05aca092671d4b1e5f2c189b1fb4cb"} Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.213214 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.310445175 podStartE2EDuration="16.213192055s" podCreationTimestamp="2025-10-07 11:36:00 +0000 UTC" firstStartedPulling="2025-10-07 11:36:05.366653614 +0000 UTC m=+932.163052613" lastFinishedPulling="2025-10-07 11:36:15.269400504 +0000 UTC m=+942.065799493" observedRunningTime="2025-10-07 11:36:16.206360296 +0000 UTC m=+943.002759305" watchObservedRunningTime="2025-10-07 11:36:16.213192055 +0000 UTC m=+943.009591084" Oct 07 11:36:16 crc kubenswrapper[4700]: I1007 11:36:16.306150 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.379443518 podStartE2EDuration="18.306105397s" podCreationTimestamp="2025-10-07 11:35:58 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.329430367 +0000 UTC m=+930.125829356" lastFinishedPulling="2025-10-07 11:36:15.256092246 +0000 UTC m=+942.052491235" observedRunningTime="2025-10-07 11:36:16.301326432 +0000 UTC m=+943.097725431" watchObservedRunningTime="2025-10-07 11:36:16.306105397 +0000 UTC m=+943.102504386" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.075231 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 
11:36:17.075591 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.137381 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.206125 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0706b451-8379-454a-bf71-483b779cb17b","Type":"ContainerStarted","Data":"4dd868f8ddddaec12ebe96c5db6caf266ed7faf90eaefaffc85899aaa1d22ea1"} Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.209751 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"793ba797-8da0-4e56-8dcc-14d7d2b0e217","Type":"ContainerStarted","Data":"3808d199ee2c98ba84b4aff4b925e7a60d21c2879d20ed2b236b45a5d7bc19ae"} Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.241222 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.327752762 podStartE2EDuration="26.24119373s" podCreationTimestamp="2025-10-07 11:35:51 +0000 UTC" firstStartedPulling="2025-10-07 11:36:02.509622771 +0000 UTC m=+929.306021770" lastFinishedPulling="2025-10-07 11:36:10.423063749 +0000 UTC m=+937.219462738" observedRunningTime="2025-10-07 11:36:17.234574246 +0000 UTC m=+944.030973245" watchObservedRunningTime="2025-10-07 11:36:17.24119373 +0000 UTC m=+944.037592749" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.267528 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.388827191 podStartE2EDuration="27.267497988s" podCreationTimestamp="2025-10-07 11:35:50 +0000 UTC" firstStartedPulling="2025-10-07 11:36:02.931659867 +0000 UTC m=+929.728058856" lastFinishedPulling="2025-10-07 11:36:10.810330664 +0000 UTC m=+937.606729653" 
observedRunningTime="2025-10-07 11:36:17.260380772 +0000 UTC m=+944.056779831" watchObservedRunningTime="2025-10-07 11:36:17.267497988 +0000 UTC m=+944.063897017" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.270053 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.421500 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.475066 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.578497 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"] Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.578714 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="dnsmasq-dns" containerID="cri-o://bbcb9fd665a718770dea6974fac5fed81ddd7235e5bddad8973ff5fe49edc4c2" gracePeriod=10 Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.580440 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.616587 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-44drg"] Oct 07 11:36:17 crc kubenswrapper[4700]: E1007 11:36:17.617013 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a5f38a-0346-47ea-9f20-0272bd55a191" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.617034 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a5f38a-0346-47ea-9f20-0272bd55a191" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: E1007 11:36:17.617060 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f30c73-3f35-4269-9bf7-ea649fd9b616" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.617068 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f30c73-3f35-4269-9bf7-ea649fd9b616" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.617264 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a5f38a-0346-47ea-9f20-0272bd55a191" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.617284 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f30c73-3f35-4269-9bf7-ea649fd9b616" containerName="init" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.617960 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.621083 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.631202 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.632530 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.640930 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.646365 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-44drg"] Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.651979 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.730726 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.730988 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de65476d-b545-432c-a5d2-5b5bd95a9369-config\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731121 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k522\" (UniqueName: \"kubernetes.io/projected/de65476d-b545-432c-a5d2-5b5bd95a9369-kube-api-access-7k522\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731195 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731347 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovs-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731446 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qsgq\" (UniqueName: \"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731547 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovn-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731646 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731735 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.731812 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-combined-ca-bundle\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833470 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovs-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833523 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsgq\" (UniqueName: \"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833542 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovn-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833592 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833609 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-combined-ca-bundle\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833642 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833678 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de65476d-b545-432c-a5d2-5b5bd95a9369-config\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833695 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833713 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k522\" (UniqueName: \"kubernetes.io/projected/de65476d-b545-432c-a5d2-5b5bd95a9369-kube-api-access-7k522\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.833902 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovs-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.834843 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de65476d-b545-432c-a5d2-5b5bd95a9369-config\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.834916 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/de65476d-b545-432c-a5d2-5b5bd95a9369-ovn-rundir\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.834954 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.835012 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.835415 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.841473 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-combined-ca-bundle\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.856095 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de65476d-b545-432c-a5d2-5b5bd95a9369-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.858927 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k522\" (UniqueName: \"kubernetes.io/projected/de65476d-b545-432c-a5d2-5b5bd95a9369-kube-api-access-7k522\") pod \"ovn-controller-metrics-44drg\" (UID: \"de65476d-b545-432c-a5d2-5b5bd95a9369\") " 
pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.862283 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qsgq\" (UniqueName: \"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq\") pod \"dnsmasq-dns-7f896c8c65-mdz96\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.979927 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-44drg" Oct 07 11:36:17 crc kubenswrapper[4700]: I1007 11:36:17.980564 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.032655 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.032697 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.034082 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.034165 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.034458 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="dnsmasq-dns" containerID="cri-o://e36a8f816e601402ab42c46f48f3465c9853cf0fa6b20480e2cdbdd05c9a6a86" gracePeriod=10 Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.039567 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.039989 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.149168 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.163874 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.164482 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjkp\" (UniqueName: \"kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" 
Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.164656 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.164854 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.207661 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.271085 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.271422 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.271463 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjkp\" (UniqueName: \"kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp\") 
pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.271516 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.271565 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.272434 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.272943 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.273449 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.274148 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.318291 4700 generic.go:334] "Generic (PLEG): container finished" podID="1936b589-6677-4ce8-8281-58612a3a5687" containerID="bbcb9fd665a718770dea6974fac5fed81ddd7235e5bddad8973ff5fe49edc4c2" exitCode=0 Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.318418 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" event={"ID":"1936b589-6677-4ce8-8281-58612a3a5687","Type":"ContainerDied","Data":"bbcb9fd665a718770dea6974fac5fed81ddd7235e5bddad8973ff5fe49edc4c2"} Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.319001 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjkp\" (UniqueName: \"kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp\") pod \"dnsmasq-dns-86db49b7ff-xmw8w\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.349674 4700 generic.go:334] "Generic (PLEG): container finished" podID="88daacf3-e024-4e18-8631-406b6e504a5f" containerID="e36a8f816e601402ab42c46f48f3465c9853cf0fa6b20480e2cdbdd05c9a6a86" exitCode=0 Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.349871 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" event={"ID":"88daacf3-e024-4e18-8631-406b6e504a5f","Type":"ContainerDied","Data":"e36a8f816e601402ab42c46f48f3465c9853cf0fa6b20480e2cdbdd05c9a6a86"} Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 
11:36:18.350627 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.468878 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.481068 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.626703 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.628148 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.632631 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h2v6c" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.632685 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.632649 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.633523 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.645684 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.679867 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" 
Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.679912 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-config\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.679932 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.679974 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.680008 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-scripts\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.680029 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfw6c\" (UniqueName: \"kubernetes.io/projected/2b5f78cd-b302-4c30-87c5-82954e351d55-kube-api-access-xfw6c\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.680060 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.738024 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-44drg"] Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794319 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794427 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794451 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-config\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794481 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794513 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794566 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-scripts\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.794586 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfw6c\" (UniqueName: \"kubernetes.io/projected/2b5f78cd-b302-4c30-87c5-82954e351d55-kube-api-access-xfw6c\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.795657 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.798983 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-scripts\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.801568 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc 
kubenswrapper[4700]: I1007 11:36:18.805701 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.807239 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5f78cd-b302-4c30-87c5-82954e351d55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.823113 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b5f78cd-b302-4c30-87c5-82954e351d55-config\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.834666 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfw6c\" (UniqueName: \"kubernetes.io/projected/2b5f78cd-b302-4c30-87c5-82954e351d55-kube-api-access-xfw6c\") pod \"ovn-northd-0\" (UID: \"2b5f78cd-b302-4c30-87c5-82954e351d55\") " pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.914113 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.940704 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.950049 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 11:36:18 crc kubenswrapper[4700]: I1007 11:36:18.988145 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:18.996758 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc\") pod \"1936b589-6677-4ce8-8281-58612a3a5687\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:18.996815 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config\") pod \"1936b589-6677-4ce8-8281-58612a3a5687\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:18.996906 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc\") pod \"88daacf3-e024-4e18-8631-406b6e504a5f\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:18.996951 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config\") pod \"88daacf3-e024-4e18-8631-406b6e504a5f\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:18.997037 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthlx\" (UniqueName: \"kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx\") pod \"88daacf3-e024-4e18-8631-406b6e504a5f\" (UID: \"88daacf3-e024-4e18-8631-406b6e504a5f\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 
11:36:18.997083 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqjxc\" (UniqueName: \"kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc\") pod \"1936b589-6677-4ce8-8281-58612a3a5687\" (UID: \"1936b589-6677-4ce8-8281-58612a3a5687\") " Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.005456 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx" (OuterVolumeSpecName: "kube-api-access-tthlx") pod "88daacf3-e024-4e18-8631-406b6e504a5f" (UID: "88daacf3-e024-4e18-8631-406b6e504a5f"). InnerVolumeSpecName "kube-api-access-tthlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.011466 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc" (OuterVolumeSpecName: "kube-api-access-xqjxc") pod "1936b589-6677-4ce8-8281-58612a3a5687" (UID: "1936b589-6677-4ce8-8281-58612a3a5687"). InnerVolumeSpecName "kube-api-access-xqjxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.069265 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config" (OuterVolumeSpecName: "config") pod "88daacf3-e024-4e18-8631-406b6e504a5f" (UID: "88daacf3-e024-4e18-8631-406b6e504a5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.089648 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88daacf3-e024-4e18-8631-406b6e504a5f" (UID: "88daacf3-e024-4e18-8631-406b6e504a5f"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.099292 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.099348 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88daacf3-e024-4e18-8631-406b6e504a5f-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.099363 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthlx\" (UniqueName: \"kubernetes.io/projected/88daacf3-e024-4e18-8631-406b6e504a5f-kube-api-access-tthlx\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.099376 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqjxc\" (UniqueName: \"kubernetes.io/projected/1936b589-6677-4ce8-8281-58612a3a5687-kube-api-access-xqjxc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.111854 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config" (OuterVolumeSpecName: "config") pod "1936b589-6677-4ce8-8281-58612a3a5687" (UID: "1936b589-6677-4ce8-8281-58612a3a5687"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.116269 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1936b589-6677-4ce8-8281-58612a3a5687" (UID: "1936b589-6677-4ce8-8281-58612a3a5687"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.136499 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.201589 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.201620 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1936b589-6677-4ce8-8281-58612a3a5687-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.360098 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" event={"ID":"1936b589-6677-4ce8-8281-58612a3a5687","Type":"ContainerDied","Data":"e332884553931e1d08f6e7774f378c4248c545bbbf22e091b94a647d004d252f"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.360145 4700 scope.go:117] "RemoveContainer" containerID="bbcb9fd665a718770dea6974fac5fed81ddd7235e5bddad8973ff5fe49edc4c2" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.360253 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-42x4h" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.374323 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" event={"ID":"65318c76-6964-4a52-848b-7fea1e6c98ef","Type":"ContainerStarted","Data":"fd9150110a2921114bc069617e2b796cd9ffa82bf8756d7820d1599fa1231032"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.377386 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-44drg" event={"ID":"de65476d-b545-432c-a5d2-5b5bd95a9369","Type":"ContainerStarted","Data":"6b0d78f947d1bf6b235d03c18312b232caa6cd07da50c86bc7aa549f1add708d"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.377428 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-44drg" event={"ID":"de65476d-b545-432c-a5d2-5b5bd95a9369","Type":"ContainerStarted","Data":"8e26865d58f7e115286b8be16bb4a281a6f0d79b48156d9ad0b024fb9ff5062e"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.381967 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" event={"ID":"88daacf3-e024-4e18-8631-406b6e504a5f","Type":"ContainerDied","Data":"2506fcf6b0666e77632a7f478646c6a81df87879ca3624c6eb911d281430fe1d"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.381996 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.386051 4700 scope.go:117] "RemoveContainer" containerID="a328b1848b9535811a64d0b1e0569a54d482c5a03ea7e6b6e6f3395fdbc55d22" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.396374 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.400504 4700 generic.go:334] "Generic (PLEG): container finished" podID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerID="6b88e3b321d163b31884cb8c8ad7a51e70893093ff03f97f6832b45a6179a6e2" exitCode=0 Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.401368 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" event={"ID":"e7cf822b-19fa-4313-8b37-a2a5faf14587","Type":"ContainerDied","Data":"6b88e3b321d163b31884cb8c8ad7a51e70893093ff03f97f6832b45a6179a6e2"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.401418 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" event={"ID":"e7cf822b-19fa-4313-8b37-a2a5faf14587","Type":"ContainerStarted","Data":"ed7e133d4b36511dc2a373d73e05a9f8f04d84ee84cf571e2ad916340a59879c"} Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.401604 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-42x4h"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.419077 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-44drg" podStartSLOduration=2.419056057 podStartE2EDuration="2.419056057s" podCreationTimestamp="2025-10-07 11:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:19.409625141 +0000 UTC m=+946.206024140" watchObservedRunningTime="2025-10-07 11:36:19.419056057 
+0000 UTC m=+946.215455046" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.469623 4700 scope.go:117] "RemoveContainer" containerID="e36a8f816e601402ab42c46f48f3465c9853cf0fa6b20480e2cdbdd05c9a6a86" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.484107 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.505247 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dtxl5"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.528012 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.536551 4700 scope.go:117] "RemoveContainer" containerID="ace19e82a45ca16d1efe113d34bd3bed1905fbc9ee1a9662ff060f79c20997d1" Oct 07 11:36:19 crc kubenswrapper[4700]: W1007 11:36:19.556848 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b5f78cd_b302_4c30_87c5_82954e351d55.slice/crio-b4766d6b9e6e2abacab31de0bef4beec97f5c22f04500dd6f4499d28e1b23cb9 WatchSource:0}: Error finding container b4766d6b9e6e2abacab31de0bef4beec97f5c22f04500dd6f4499d28e1b23cb9: Status 404 returned error can't find the container with id b4766d6b9e6e2abacab31de0bef4beec97f5c22f04500dd6f4499d28e1b23cb9 Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.967572 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1936b589-6677-4ce8-8281-58612a3a5687" path="/var/lib/kubelet/pods/1936b589-6677-4ce8-8281-58612a3a5687/volumes" Oct 07 11:36:19 crc kubenswrapper[4700]: I1007 11:36:19.968500 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" path="/var/lib/kubelet/pods/88daacf3-e024-4e18-8631-406b6e504a5f/volumes" Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.410374 4700 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" event={"ID":"e7cf822b-19fa-4313-8b37-a2a5faf14587","Type":"ContainerStarted","Data":"47af6b6ff9ef5fd7f618a5997659b15d4a8e586766205c3395d4e2052a2bb5e8"} Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.410452 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.430007 4700 generic.go:334] "Generic (PLEG): container finished" podID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerID="36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1" exitCode=0 Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.430817 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" event={"ID":"65318c76-6964-4a52-848b-7fea1e6c98ef","Type":"ContainerDied","Data":"36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1"} Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.433543 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" podStartSLOduration=3.433526288 podStartE2EDuration="3.433526288s" podCreationTimestamp="2025-10-07 11:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:20.429356669 +0000 UTC m=+947.225755678" watchObservedRunningTime="2025-10-07 11:36:20.433526288 +0000 UTC m=+947.229925277" Oct 07 11:36:20 crc kubenswrapper[4700]: I1007 11:36:20.451291 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b5f78cd-b302-4c30-87c5-82954e351d55","Type":"ContainerStarted","Data":"b4766d6b9e6e2abacab31de0bef4beec97f5c22f04500dd6f4499d28e1b23cb9"} Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.472198 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" 
event={"ID":"65318c76-6964-4a52-848b-7fea1e6c98ef","Type":"ContainerStarted","Data":"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211"} Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.472555 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.474314 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b5f78cd-b302-4c30-87c5-82954e351d55","Type":"ContainerStarted","Data":"9cdcd4b5719cd5686a629446d87c4bd1875cc37a21e754b3024a0a65eeada78c"} Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.474354 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b5f78cd-b302-4c30-87c5-82954e351d55","Type":"ContainerStarted","Data":"88c960117e542b5bcf771b358124c83c6cbb034c781eee8199a13a51d094436f"} Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.496905 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" podStartSLOduration=4.496886028 podStartE2EDuration="4.496886028s" podCreationTimestamp="2025-10-07 11:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:21.491451436 +0000 UTC m=+948.287850455" watchObservedRunningTime="2025-10-07 11:36:21.496886028 +0000 UTC m=+948.293285017" Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.512695 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.999866088 podStartE2EDuration="3.512672461s" podCreationTimestamp="2025-10-07 11:36:18 +0000 UTC" firstStartedPulling="2025-10-07 11:36:19.564853913 +0000 UTC m=+946.361252902" lastFinishedPulling="2025-10-07 11:36:21.077660286 +0000 UTC m=+947.874059275" observedRunningTime="2025-10-07 11:36:21.508280816 +0000 UTC 
m=+948.304679805" watchObservedRunningTime="2025-10-07 11:36:21.512672461 +0000 UTC m=+948.309071450" Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.591409 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.591790 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 11:36:21 crc kubenswrapper[4700]: I1007 11:36:21.662217 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.485085 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.544233 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.784670 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.784785 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.944624 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bbffr"] Oct 07 11:36:22 crc kubenswrapper[4700]: E1007 11:36:22.945058 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="init" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945085 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="init" Oct 07 11:36:22 crc kubenswrapper[4700]: E1007 11:36:22.945101 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="init" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945114 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="init" Oct 07 11:36:22 crc kubenswrapper[4700]: E1007 11:36:22.945131 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945165 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: E1007 11:36:22.945187 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945200 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945504 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.945529 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1936b589-6677-4ce8-8281-58612a3a5687" containerName="dnsmasq-dns" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.946396 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:22 crc kubenswrapper[4700]: I1007 11:36:22.954339 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bbffr"] Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.094129 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6q5\" (UniqueName: \"kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5\") pod \"keystone-db-create-bbffr\" (UID: \"ea8dd361-88e8-4530-9d61-ac84770c792e\") " pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.139065 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-qn762"] Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.142375 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qn762" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.153271 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qn762"] Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.195627 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6q5\" (UniqueName: \"kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5\") pod \"keystone-db-create-bbffr\" (UID: \"ea8dd361-88e8-4530-9d61-ac84770c792e\") " pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.227726 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6q5\" (UniqueName: \"kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5\") pod \"keystone-db-create-bbffr\" (UID: \"ea8dd361-88e8-4530-9d61-ac84770c792e\") " pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.267768 4700 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.297710 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2v6\" (UniqueName: \"kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6\") pod \"placement-db-create-qn762\" (UID: \"f9008225-ce95-4591-9a50-8f2982a231a5\") " pod="openstack/placement-db-create-qn762" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.401369 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2v6\" (UniqueName: \"kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6\") pod \"placement-db-create-qn762\" (UID: \"f9008225-ce95-4591-9a50-8f2982a231a5\") " pod="openstack/placement-db-create-qn762" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.422722 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2v6\" (UniqueName: \"kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6\") pod \"placement-db-create-qn762\" (UID: \"f9008225-ce95-4591-9a50-8f2982a231a5\") " pod="openstack/placement-db-create-qn762" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.475135 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qn762" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.729466 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-dtxl5" podUID="88daacf3-e024-4e18-8631-406b6e504a5f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout" Oct 07 11:36:23 crc kubenswrapper[4700]: I1007 11:36:23.765256 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bbffr"] Oct 07 11:36:23 crc kubenswrapper[4700]: W1007 11:36:23.780215 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8dd361_88e8_4530_9d61_ac84770c792e.slice/crio-6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601 WatchSource:0}: Error finding container 6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601: Status 404 returned error can't find the container with id 6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601 Oct 07 11:36:24 crc kubenswrapper[4700]: I1007 11:36:24.064233 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qn762"] Oct 07 11:36:24 crc kubenswrapper[4700]: W1007 11:36:24.069119 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9008225_ce95_4591_9a50_8f2982a231a5.slice/crio-4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e WatchSource:0}: Error finding container 4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e: Status 404 returned error can't find the container with id 4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e Oct 07 11:36:24 crc kubenswrapper[4700]: I1007 11:36:24.399136 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 11:36:24 crc kubenswrapper[4700]: 
I1007 11:36:24.454933 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="793ba797-8da0-4e56-8dcc-14d7d2b0e217" containerName="galera" probeResult="failure" output=< Oct 07 11:36:24 crc kubenswrapper[4700]: wsrep_local_state_comment (Joined) differs from Synced Oct 07 11:36:24 crc kubenswrapper[4700]: > Oct 07 11:36:24 crc kubenswrapper[4700]: I1007 11:36:24.526539 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bbffr" event={"ID":"ea8dd361-88e8-4530-9d61-ac84770c792e","Type":"ContainerStarted","Data":"6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601"} Oct 07 11:36:24 crc kubenswrapper[4700]: I1007 11:36:24.529037 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qn762" event={"ID":"f9008225-ce95-4591-9a50-8f2982a231a5","Type":"ContainerStarted","Data":"4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e"} Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.211089 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.336729 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.337016 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="dnsmasq-dns" containerID="cri-o://47af6b6ff9ef5fd7f618a5997659b15d4a8e586766205c3395d4e2052a2bb5e8" gracePeriod=10 Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.345446 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.388622 4700 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.390128 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.399453 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.538601 4700 generic.go:334] "Generic (PLEG): container finished" podID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerID="47af6b6ff9ef5fd7f618a5997659b15d4a8e586766205c3395d4e2052a2bb5e8" exitCode=0 Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.538659 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" event={"ID":"e7cf822b-19fa-4313-8b37-a2a5faf14587","Type":"ContainerDied","Data":"47af6b6ff9ef5fd7f618a5997659b15d4a8e586766205c3395d4e2052a2bb5e8"} Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.550500 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.550572 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.550653 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.550732 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.550756 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7c7t\" (UniqueName: \"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.652258 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7c7t\" (UniqueName: \"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.652352 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.652411 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.652459 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.652555 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.653669 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.653686 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.653733 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb\") pod 
\"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.653805 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.683778 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7c7t\" (UniqueName: \"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t\") pod \"dnsmasq-dns-698758b865-brvlc\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:25 crc kubenswrapper[4700]: I1007 11:36:25.714360 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.183016 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:36:26 crc kubenswrapper[4700]: W1007 11:36:26.188764 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f6ec6b3_bf89_4f27_a2df_d07a61eee130.slice/crio-250e9a5e1f2f5cc9fcf59cb967649e997a04b0f4bcf6783c484211000c33e1f6 WatchSource:0}: Error finding container 250e9a5e1f2f5cc9fcf59cb967649e997a04b0f4bcf6783c484211000c33e1f6: Status 404 returned error can't find the container with id 250e9a5e1f2f5cc9fcf59cb967649e997a04b0f4bcf6783c484211000c33e1f6 Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.507984 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.516150 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.519159 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.519335 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.519452 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.519578 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-f4lt4" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.521663 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.562582 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerStarted","Data":"250e9a5e1f2f5cc9fcf59cb967649e997a04b0f4bcf6783c484211000c33e1f6"} Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.670220 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-cache\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.670874 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-lock\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.671068 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jztt7\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-kube-api-access-jztt7\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.671119 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.671272 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.772690 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.772774 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-cache\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: E1007 11:36:26.772892 4700 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.772917 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-lock\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.773002 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jztt7\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-kube-api-access-jztt7\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.773041 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: E1007 11:36:26.772927 4700 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.773484 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-lock\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: E1007 11:36:26.773492 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift podName:6555d4a9-f098-43b2-9b50-f7c9d855cf6a nodeName:}" failed. No retries permitted until 2025-10-07 11:36:27.273453683 +0000 UTC m=+954.069852682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift") pod "swift-storage-0" (UID: "6555d4a9-f098-43b2-9b50-f7c9d855cf6a") : configmap "swift-ring-files" not found Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.773323 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-cache\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.773565 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.801586 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:26 crc kubenswrapper[4700]: I1007 11:36:26.804243 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jztt7\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-kube-api-access-jztt7\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.004098 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pfsbd"] Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.006868 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.008323 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.009533 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.009705 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.022216 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pfsbd"] Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179208 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179358 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179407 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8tlv\" (UniqueName: \"kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 
11:36:27.179440 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179489 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179615 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.179718 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281391 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 
11:36:27.281533 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8tlv\" (UniqueName: \"kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281575 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281635 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281703 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281760 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281847 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.281892 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.282261 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: E1007 11:36:27.282374 4700 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 11:36:27 crc kubenswrapper[4700]: E1007 11:36:27.282413 4700 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 11:36:27 crc kubenswrapper[4700]: E1007 11:36:27.282501 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift podName:6555d4a9-f098-43b2-9b50-f7c9d855cf6a nodeName:}" failed. No retries permitted until 2025-10-07 11:36:28.282471206 +0000 UTC m=+955.078870235 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift") pod "swift-storage-0" (UID: "6555d4a9-f098-43b2-9b50-f7c9d855cf6a") : configmap "swift-ring-files" not found Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.282981 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.283031 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.286596 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.287962 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.288714 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle\") pod 
\"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.321926 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8tlv\" (UniqueName: \"kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv\") pod \"swift-ring-rebalance-pfsbd\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.330258 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.586428 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bbffr" event={"ID":"ea8dd361-88e8-4530-9d61-ac84770c792e","Type":"ContainerStarted","Data":"fddd95d90b055b59f30571370ffbadef4dd2bfa0f1f04155a4586d921eee3bcd"} Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.590585 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qn762" event={"ID":"f9008225-ce95-4591-9a50-8f2982a231a5","Type":"ContainerStarted","Data":"de7e272971a3f058c8529c2a1e7dc497030c88992d5aedb6dfd742b8a505cbd8"} Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.605550 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bbffr" podStartSLOduration=5.60553096 podStartE2EDuration="5.60553096s" podCreationTimestamp="2025-10-07 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:27.602636515 +0000 UTC m=+954.399035524" watchObservedRunningTime="2025-10-07 11:36:27.60553096 +0000 UTC m=+954.401929949" Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.779740 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-ring-rebalance-pfsbd"] Oct 07 11:36:27 crc kubenswrapper[4700]: W1007 11:36:27.787255 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3352f4b9_00aa_419c_a354_1fb7b7120ad5.slice/crio-f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2 WatchSource:0}: Error finding container f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2: Status 404 returned error can't find the container with id f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2 Oct 07 11:36:27 crc kubenswrapper[4700]: I1007 11:36:27.981646 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.296767 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:28 crc kubenswrapper[4700]: E1007 11:36:28.297132 4700 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 11:36:28 crc kubenswrapper[4700]: E1007 11:36:28.297148 4700 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 11:36:28 crc kubenswrapper[4700]: E1007 11:36:28.297188 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift podName:6555d4a9-f098-43b2-9b50-f7c9d855cf6a nodeName:}" failed. 
No retries permitted until 2025-10-07 11:36:30.297176582 +0000 UTC m=+957.093575571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift") pod "swift-storage-0" (UID: "6555d4a9-f098-43b2-9b50-f7c9d855cf6a") : configmap "swift-ring-files" not found Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.482558 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.487710 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v9hfj"] Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.488672 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.496273 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v9hfj"] Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.600851 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44jn\" (UniqueName: \"kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn\") pod \"glance-db-create-v9hfj\" (UID: \"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09\") " pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.600896 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" event={"ID":"e7cf822b-19fa-4313-8b37-a2a5faf14587","Type":"ContainerDied","Data":"ed7e133d4b36511dc2a373d73e05a9f8f04d84ee84cf571e2ad916340a59879c"} Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.601477 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed7e133d4b36511dc2a373d73e05a9f8f04d84ee84cf571e2ad916340a59879c" Oct 07 11:36:28 crc 
kubenswrapper[4700]: I1007 11:36:28.602579 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pfsbd" event={"ID":"3352f4b9-00aa-419c-a354-1fb7b7120ad5","Type":"ContainerStarted","Data":"f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2"} Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.604986 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerStarted","Data":"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4"} Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.624345 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-qn762" podStartSLOduration=5.624328044 podStartE2EDuration="5.624328044s" podCreationTimestamp="2025-10-07 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:28.618466291 +0000 UTC m=+955.414865280" watchObservedRunningTime="2025-10-07 11:36:28.624328044 +0000 UTC m=+955.420727023" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.702737 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f44jn\" (UniqueName: \"kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn\") pod \"glance-db-create-v9hfj\" (UID: \"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09\") " pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.719529 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.729532 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44jn\" (UniqueName: \"kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn\") pod \"glance-db-create-v9hfj\" (UID: \"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09\") " pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.803571 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qsgq\" (UniqueName: \"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq\") pod \"e7cf822b-19fa-4313-8b37-a2a5faf14587\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.803727 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb\") pod \"e7cf822b-19fa-4313-8b37-a2a5faf14587\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.803818 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config\") pod \"e7cf822b-19fa-4313-8b37-a2a5faf14587\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.803886 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc\") pod \"e7cf822b-19fa-4313-8b37-a2a5faf14587\" (UID: \"e7cf822b-19fa-4313-8b37-a2a5faf14587\") " Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.808120 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq" (OuterVolumeSpecName: "kube-api-access-4qsgq") pod "e7cf822b-19fa-4313-8b37-a2a5faf14587" (UID: "e7cf822b-19fa-4313-8b37-a2a5faf14587"). InnerVolumeSpecName "kube-api-access-4qsgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.811382 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.844217 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7cf822b-19fa-4313-8b37-a2a5faf14587" (UID: "e7cf822b-19fa-4313-8b37-a2a5faf14587"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.849533 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7cf822b-19fa-4313-8b37-a2a5faf14587" (UID: "e7cf822b-19fa-4313-8b37-a2a5faf14587"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.854612 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config" (OuterVolumeSpecName: "config") pod "e7cf822b-19fa-4313-8b37-a2a5faf14587" (UID: "e7cf822b-19fa-4313-8b37-a2a5faf14587"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.908592 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.908627 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qsgq\" (UniqueName: \"kubernetes.io/projected/e7cf822b-19fa-4313-8b37-a2a5faf14587-kube-api-access-4qsgq\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.908641 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:28 crc kubenswrapper[4700]: I1007 11:36:28.908655 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cf822b-19fa-4313-8b37-a2a5faf14587-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.355928 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v9hfj"] Oct 07 11:36:29 crc kubenswrapper[4700]: W1007 11:36:29.368633 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e9d9fd_6daa_4dd6_a1d7_3dbdf6384b09.slice/crio-0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c WatchSource:0}: Error finding container 0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c: Status 404 returned error can't find the container with id 0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.615756 4700 generic.go:334] "Generic (PLEG): container finished" podID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" 
containerID="7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4" exitCode=0 Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.615804 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerDied","Data":"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4"} Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.619301 4700 generic.go:334] "Generic (PLEG): container finished" podID="ea8dd361-88e8-4530-9d61-ac84770c792e" containerID="fddd95d90b055b59f30571370ffbadef4dd2bfa0f1f04155a4586d921eee3bcd" exitCode=0 Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.619385 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bbffr" event={"ID":"ea8dd361-88e8-4530-9d61-ac84770c792e","Type":"ContainerDied","Data":"fddd95d90b055b59f30571370ffbadef4dd2bfa0f1f04155a4586d921eee3bcd"} Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.625088 4700 generic.go:334] "Generic (PLEG): container finished" podID="f9008225-ce95-4591-9a50-8f2982a231a5" containerID="de7e272971a3f058c8529c2a1e7dc497030c88992d5aedb6dfd742b8a505cbd8" exitCode=0 Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.625350 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qn762" event={"ID":"f9008225-ce95-4591-9a50-8f2982a231a5","Type":"ContainerDied","Data":"de7e272971a3f058c8529c2a1e7dc497030c88992d5aedb6dfd742b8a505cbd8"} Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.629598 4700 generic.go:334] "Generic (PLEG): container finished" podID="c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" containerID="65d073bc1e52cf98a23725406668688e5c82fb318f5179cbc95e90e6c7ac74ce" exitCode=0 Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.629683 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mdz96" Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.629773 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9hfj" event={"ID":"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09","Type":"ContainerDied","Data":"65d073bc1e52cf98a23725406668688e5c82fb318f5179cbc95e90e6c7ac74ce"} Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.629889 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9hfj" event={"ID":"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09","Type":"ContainerStarted","Data":"0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c"} Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.714295 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.721900 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mdz96"] Oct 07 11:36:29 crc kubenswrapper[4700]: I1007 11:36:29.970239 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" path="/var/lib/kubelet/pods/e7cf822b-19fa-4313-8b37-a2a5faf14587/volumes" Oct 07 11:36:30 crc kubenswrapper[4700]: I1007 11:36:30.337473 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:30 crc kubenswrapper[4700]: E1007 11:36:30.337739 4700 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 11:36:30 crc kubenswrapper[4700]: E1007 11:36:30.337757 4700 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Oct 07 11:36:30 crc kubenswrapper[4700]: E1007 11:36:30.337810 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift podName:6555d4a9-f098-43b2-9b50-f7c9d855cf6a nodeName:}" failed. No retries permitted until 2025-10-07 11:36:34.337792228 +0000 UTC m=+961.134191227 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift") pod "swift-storage-0" (UID: "6555d4a9-f098-43b2-9b50-f7c9d855cf6a") : configmap "swift-ring-files" not found Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.660128 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qn762" event={"ID":"f9008225-ce95-4591-9a50-8f2982a231a5","Type":"ContainerDied","Data":"4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e"} Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.661521 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc16fc5d84510900613c227907ef8179a297d44cde28306bc82d90b43e9f78e" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.663785 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9hfj" event={"ID":"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09","Type":"ContainerDied","Data":"0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c"} Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.663830 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b930aa7d3fba353a955ec884527ce6f2b1931c46a5fecde29d6106ee45d433c" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.667094 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bbffr" event={"ID":"ea8dd361-88e8-4530-9d61-ac84770c792e","Type":"ContainerDied","Data":"6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601"} Oct 07 
11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.667231 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c6dc2ce7424634e7abd7176897321a22e55965038c80d0e6a6009ea6964f601" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.881156 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.932885 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qn762" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.960694 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.961650 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz6q5\" (UniqueName: \"kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5\") pod \"ea8dd361-88e8-4530-9d61-ac84770c792e\" (UID: \"ea8dd361-88e8-4530-9d61-ac84770c792e\") " Oct 07 11:36:31 crc kubenswrapper[4700]: I1007 11:36:31.969728 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5" (OuterVolumeSpecName: "kube-api-access-jz6q5") pod "ea8dd361-88e8-4530-9d61-ac84770c792e" (UID: "ea8dd361-88e8-4530-9d61-ac84770c792e"). InnerVolumeSpecName "kube-api-access-jz6q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.063519 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s2v6\" (UniqueName: \"kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6\") pod \"f9008225-ce95-4591-9a50-8f2982a231a5\" (UID: \"f9008225-ce95-4591-9a50-8f2982a231a5\") " Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.063612 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f44jn\" (UniqueName: \"kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn\") pod \"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09\" (UID: \"c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09\") " Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.063957 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz6q5\" (UniqueName: \"kubernetes.io/projected/ea8dd361-88e8-4530-9d61-ac84770c792e-kube-api-access-jz6q5\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.067135 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn" (OuterVolumeSpecName: "kube-api-access-f44jn") pod "c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" (UID: "c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09"). InnerVolumeSpecName "kube-api-access-f44jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.068058 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6" (OuterVolumeSpecName: "kube-api-access-6s2v6") pod "f9008225-ce95-4591-9a50-8f2982a231a5" (UID: "f9008225-ce95-4591-9a50-8f2982a231a5"). InnerVolumeSpecName "kube-api-access-6s2v6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.165200 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s2v6\" (UniqueName: \"kubernetes.io/projected/f9008225-ce95-4591-9a50-8f2982a231a5-kube-api-access-6s2v6\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.165240 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f44jn\" (UniqueName: \"kubernetes.io/projected/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09-kube-api-access-f44jn\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.679572 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerStarted","Data":"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833"} Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.679663 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.682031 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pfsbd" event={"ID":"3352f4b9-00aa-419c-a354-1fb7b7120ad5","Type":"ContainerStarted","Data":"baadb84b9e24cfcf1fa69e883c00f643d6148fb15bc571fb5d5549af4fc424da"} Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.682091 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qn762" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.682096 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bbffr" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.682129 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v9hfj" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.768744 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pfsbd" podStartSLOduration=2.873403132 podStartE2EDuration="6.768721589s" podCreationTimestamp="2025-10-07 11:36:26 +0000 UTC" firstStartedPulling="2025-10-07 11:36:27.791558389 +0000 UTC m=+954.587957378" lastFinishedPulling="2025-10-07 11:36:31.686876836 +0000 UTC m=+958.483275835" observedRunningTime="2025-10-07 11:36:32.76344977 +0000 UTC m=+959.559848759" watchObservedRunningTime="2025-10-07 11:36:32.768721589 +0000 UTC m=+959.565120578" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.770173 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-brvlc" podStartSLOduration=7.770158706 podStartE2EDuration="7.770158706s" podCreationTimestamp="2025-10-07 11:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:36:32.738530908 +0000 UTC m=+959.534929897" watchObservedRunningTime="2025-10-07 11:36:32.770158706 +0000 UTC m=+959.566557695" Oct 07 11:36:32 crc kubenswrapper[4700]: I1007 11:36:32.872568 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 11:36:34 crc kubenswrapper[4700]: I1007 11:36:34.036822 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 11:36:34 crc kubenswrapper[4700]: I1007 11:36:34.409505 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:34 crc kubenswrapper[4700]: E1007 
11:36:34.409776 4700 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 11:36:34 crc kubenswrapper[4700]: E1007 11:36:34.409794 4700 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 11:36:34 crc kubenswrapper[4700]: E1007 11:36:34.409850 4700 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift podName:6555d4a9-f098-43b2-9b50-f7c9d855cf6a nodeName:}" failed. No retries permitted until 2025-10-07 11:36:42.409832979 +0000 UTC m=+969.206231968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift") pod "swift-storage-0" (UID: "6555d4a9-f098-43b2-9b50-f7c9d855cf6a") : configmap "swift-ring-files" not found Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.621605 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ddb5-account-create-6bjgp"] Oct 07 11:36:38 crc kubenswrapper[4700]: E1007 11:36:38.622247 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8dd361-88e8-4530-9d61-ac84770c792e" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622260 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8dd361-88e8-4530-9d61-ac84770c792e" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: E1007 11:36:38.622275 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="init" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622281 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="init" Oct 07 11:36:38 crc kubenswrapper[4700]: E1007 11:36:38.622292 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622299 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: E1007 11:36:38.622330 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9008225-ce95-4591-9a50-8f2982a231a5" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622336 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9008225-ce95-4591-9a50-8f2982a231a5" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: E1007 11:36:38.622363 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="dnsmasq-dns" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622369 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="dnsmasq-dns" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622712 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622745 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8dd361-88e8-4530-9d61-ac84770c792e" containerName="mariadb-database-create" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622761 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cf822b-19fa-4313-8b37-a2a5faf14587" containerName="dnsmasq-dns" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.622783 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9008225-ce95-4591-9a50-8f2982a231a5" containerName="mariadb-database-create" Oct 07 11:36:38 crc 
kubenswrapper[4700]: I1007 11:36:38.623332 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.626217 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ddb5-account-create-6bjgp"] Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.632130 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.687955 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcc7k\" (UniqueName: \"kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k\") pod \"glance-ddb5-account-create-6bjgp\" (UID: \"3eb24dac-4e3e-44db-bdb3-b5dc85b446af\") " pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.789762 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcc7k\" (UniqueName: \"kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k\") pod \"glance-ddb5-account-create-6bjgp\" (UID: \"3eb24dac-4e3e-44db-bdb3-b5dc85b446af\") " pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.816014 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcc7k\" (UniqueName: \"kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k\") pod \"glance-ddb5-account-create-6bjgp\" (UID: \"3eb24dac-4e3e-44db-bdb3-b5dc85b446af\") " pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:38 crc kubenswrapper[4700]: I1007 11:36:38.941545 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:39 crc kubenswrapper[4700]: W1007 11:36:39.389058 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb24dac_4e3e_44db_bdb3_b5dc85b446af.slice/crio-ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099 WatchSource:0}: Error finding container ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099: Status 404 returned error can't find the container with id ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099 Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.389612 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ddb5-account-create-6bjgp"] Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.745389 4700 generic.go:334] "Generic (PLEG): container finished" podID="3352f4b9-00aa-419c-a354-1fb7b7120ad5" containerID="baadb84b9e24cfcf1fa69e883c00f643d6148fb15bc571fb5d5549af4fc424da" exitCode=0 Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.745444 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pfsbd" event={"ID":"3352f4b9-00aa-419c-a354-1fb7b7120ad5","Type":"ContainerDied","Data":"baadb84b9e24cfcf1fa69e883c00f643d6148fb15bc571fb5d5549af4fc424da"} Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.748114 4700 generic.go:334] "Generic (PLEG): container finished" podID="3eb24dac-4e3e-44db-bdb3-b5dc85b446af" containerID="3b2c970b225bc607b376513ede6e3ecd84aa161923b43c27a1cc7917a322c541" exitCode=0 Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.748242 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ddb5-account-create-6bjgp" event={"ID":"3eb24dac-4e3e-44db-bdb3-b5dc85b446af","Type":"ContainerDied","Data":"3b2c970b225bc607b376513ede6e3ecd84aa161923b43c27a1cc7917a322c541"} Oct 07 11:36:39 crc kubenswrapper[4700]: I1007 11:36:39.748339 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ddb5-account-create-6bjgp" event={"ID":"3eb24dac-4e3e-44db-bdb3-b5dc85b446af","Type":"ContainerStarted","Data":"ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099"} Oct 07 11:36:40 crc kubenswrapper[4700]: I1007 11:36:40.715485 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:36:40 crc kubenswrapper[4700]: I1007 11:36:40.819818 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:40 crc kubenswrapper[4700]: I1007 11:36:40.820143 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="dnsmasq-dns" containerID="cri-o://78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211" gracePeriod=10 Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.355838 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.455344 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcc7k\" (UniqueName: \"kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k\") pod \"3eb24dac-4e3e-44db-bdb3-b5dc85b446af\" (UID: \"3eb24dac-4e3e-44db-bdb3-b5dc85b446af\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.460576 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.462236 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k" (OuterVolumeSpecName: "kube-api-access-pcc7k") pod "3eb24dac-4e3e-44db-bdb3-b5dc85b446af" (UID: "3eb24dac-4e3e-44db-bdb3-b5dc85b446af"). InnerVolumeSpecName "kube-api-access-pcc7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.468696 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.556958 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc\") pod \"65318c76-6964-4a52-848b-7fea1e6c98ef\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557469 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8tlv\" (UniqueName: \"kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557506 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557526 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb\") pod \"65318c76-6964-4a52-848b-7fea1e6c98ef\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557576 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb\") pod \"65318c76-6964-4a52-848b-7fea1e6c98ef\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557593 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557612 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557718 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557739 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config\") pod \"65318c76-6964-4a52-848b-7fea1e6c98ef\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557779 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkjkp\" (UniqueName: \"kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp\") pod \"65318c76-6964-4a52-848b-7fea1e6c98ef\" (UID: \"65318c76-6964-4a52-848b-7fea1e6c98ef\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557804 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.557862 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf\") pod \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\" (UID: \"3352f4b9-00aa-419c-a354-1fb7b7120ad5\") " Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.558197 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcc7k\" (UniqueName: \"kubernetes.io/projected/3eb24dac-4e3e-44db-bdb3-b5dc85b446af-kube-api-access-pcc7k\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.561125 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv" (OuterVolumeSpecName: "kube-api-access-z8tlv") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "kube-api-access-z8tlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.561163 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.562329 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.563630 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp" (OuterVolumeSpecName: "kube-api-access-hkjkp") pod "65318c76-6964-4a52-848b-7fea1e6c98ef" (UID: "65318c76-6964-4a52-848b-7fea1e6c98ef"). InnerVolumeSpecName "kube-api-access-hkjkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.566252 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.585590 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts" (OuterVolumeSpecName: "scripts") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.585655 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.597651 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3352f4b9-00aa-419c-a354-1fb7b7120ad5" (UID: "3352f4b9-00aa-419c-a354-1fb7b7120ad5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.606166 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65318c76-6964-4a52-848b-7fea1e6c98ef" (UID: "65318c76-6964-4a52-848b-7fea1e6c98ef"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.608261 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config" (OuterVolumeSpecName: "config") pod "65318c76-6964-4a52-848b-7fea1e6c98ef" (UID: "65318c76-6964-4a52-848b-7fea1e6c98ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.609705 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65318c76-6964-4a52-848b-7fea1e6c98ef" (UID: "65318c76-6964-4a52-848b-7fea1e6c98ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.618004 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65318c76-6964-4a52-848b-7fea1e6c98ef" (UID: "65318c76-6964-4a52-848b-7fea1e6c98ef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659332 4700 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3352f4b9-00aa-419c-a354-1fb7b7120ad5-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659365 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659374 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjkp\" (UniqueName: \"kubernetes.io/projected/65318c76-6964-4a52-848b-7fea1e6c98ef-kube-api-access-hkjkp\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659385 4700 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659394 4700 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659402 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659410 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8tlv\" (UniqueName: \"kubernetes.io/projected/3352f4b9-00aa-419c-a354-1fb7b7120ad5-kube-api-access-z8tlv\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659420 4700 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3352f4b9-00aa-419c-a354-1fb7b7120ad5-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659427 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659435 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65318c76-6964-4a52-848b-7fea1e6c98ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659443 4700 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.659452 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3352f4b9-00aa-419c-a354-1fb7b7120ad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.836126 4700 generic.go:334] "Generic (PLEG): container finished" podID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerID="78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211" exitCode=0 Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.836180 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.836195 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" event={"ID":"65318c76-6964-4a52-848b-7fea1e6c98ef","Type":"ContainerDied","Data":"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211"} Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.836256 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmw8w" event={"ID":"65318c76-6964-4a52-848b-7fea1e6c98ef","Type":"ContainerDied","Data":"fd9150110a2921114bc069617e2b796cd9ffa82bf8756d7820d1599fa1231032"} Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.836288 4700 scope.go:117] "RemoveContainer" containerID="78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.840124 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pfsbd" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.840132 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pfsbd" event={"ID":"3352f4b9-00aa-419c-a354-1fb7b7120ad5","Type":"ContainerDied","Data":"f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2"} Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.840233 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65936672fa6ebc4a68d059798745a491207caf6d5a2fa4cf6b844deb13601f2" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.842116 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ddb5-account-create-6bjgp" event={"ID":"3eb24dac-4e3e-44db-bdb3-b5dc85b446af","Type":"ContainerDied","Data":"ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099"} Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.842133 4700 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec8298e65ec70f637429309939b179464820bf86b5b63695288fb26fa1f3e099" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.842167 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ddb5-account-create-6bjgp" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.884627 4700 scope.go:117] "RemoveContainer" containerID="36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.886541 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.893709 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmw8w"] Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.914502 4700 scope.go:117] "RemoveContainer" containerID="78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211" Oct 07 11:36:41 crc kubenswrapper[4700]: E1007 11:36:41.915058 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211\": container with ID starting with 78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211 not found: ID does not exist" containerID="78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.915131 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211"} err="failed to get container status \"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211\": rpc error: code = NotFound desc = could not find container \"78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211\": container with ID starting with 
78d9d2c3fa06aabc96342b11798408a0aaa74c63c96a422f4c87cf9b062cd211 not found: ID does not exist" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.915172 4700 scope.go:117] "RemoveContainer" containerID="36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1" Oct 07 11:36:41 crc kubenswrapper[4700]: E1007 11:36:41.915549 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1\": container with ID starting with 36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1 not found: ID does not exist" containerID="36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.915590 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1"} err="failed to get container status \"36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1\": rpc error: code = NotFound desc = could not find container \"36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1\": container with ID starting with 36b780bb61470f22da2f3e4d8176d71469aec84575f76ea7a28fd0722ae25ed1 not found: ID does not exist" Oct 07 11:36:41 crc kubenswrapper[4700]: I1007 11:36:41.972747 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" path="/var/lib/kubelet/pods/65318c76-6964-4a52-848b-7fea1e6c98ef/volumes" Oct 07 11:36:42 crc kubenswrapper[4700]: I1007 11:36:42.471619 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:42 crc kubenswrapper[4700]: I1007 11:36:42.481401 
4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6555d4a9-f098-43b2-9b50-f7c9d855cf6a-etc-swift\") pod \"swift-storage-0\" (UID: \"6555d4a9-f098-43b2-9b50-f7c9d855cf6a\") " pod="openstack/swift-storage-0" Oct 07 11:36:42 crc kubenswrapper[4700]: I1007 11:36:42.732649 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.998391 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4970-account-create-b52jf"] Oct 07 11:36:43 crc kubenswrapper[4700]: E1007 11:36:42.999090 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb24dac-4e3e-44db-bdb3-b5dc85b446af" containerName="mariadb-account-create" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999106 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb24dac-4e3e-44db-bdb3-b5dc85b446af" containerName="mariadb-account-create" Oct 07 11:36:43 crc kubenswrapper[4700]: E1007 11:36:42.999132 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="dnsmasq-dns" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999140 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="dnsmasq-dns" Oct 07 11:36:43 crc kubenswrapper[4700]: E1007 11:36:42.999162 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="init" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999171 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="init" Oct 07 11:36:43 crc kubenswrapper[4700]: E1007 11:36:42.999190 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3352f4b9-00aa-419c-a354-1fb7b7120ad5" 
containerName="swift-ring-rebalance" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999198 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3352f4b9-00aa-419c-a354-1fb7b7120ad5" containerName="swift-ring-rebalance" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999417 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="65318c76-6964-4a52-848b-7fea1e6c98ef" containerName="dnsmasq-dns" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999435 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3352f4b9-00aa-419c-a354-1fb7b7120ad5" containerName="swift-ring-rebalance" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:42.999445 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb24dac-4e3e-44db-bdb3-b5dc85b446af" containerName="mariadb-account-create" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.000288 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.005027 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.009794 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4970-account-create-b52jf"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.081674 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn56t\" (UniqueName: \"kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t\") pod \"keystone-4970-account-create-b52jf\" (UID: \"b7923001-b1a5-4e98-8776-9acdf9f161a1\") " pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.183255 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn56t\" (UniqueName: 
\"kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t\") pod \"keystone-4970-account-create-b52jf\" (UID: \"b7923001-b1a5-4e98-8776-9acdf9f161a1\") " pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.192860 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5355-account-create-2pblh"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.193883 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.196170 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.206261 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn56t\" (UniqueName: \"kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t\") pod \"keystone-4970-account-create-b52jf\" (UID: \"b7923001-b1a5-4e98-8776-9acdf9f161a1\") " pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.206474 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5355-account-create-2pblh"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.284789 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgth5\" (UniqueName: \"kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5\") pod \"placement-5355-account-create-2pblh\" (UID: \"cefeb69f-ade8-41be-9525-ed5d017c8b2b\") " pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.323927 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.345288 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.386576 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgth5\" (UniqueName: \"kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5\") pod \"placement-5355-account-create-2pblh\" (UID: \"cefeb69f-ade8-41be-9525-ed5d017c8b2b\") " pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.407020 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgth5\" (UniqueName: \"kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5\") pod \"placement-5355-account-create-2pblh\" (UID: \"cefeb69f-ade8-41be-9525-ed5d017c8b2b\") " pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.539621 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.790776 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-24ghv"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.792011 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.796631 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.796960 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9x8lb" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.805870 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4970-account-create-b52jf"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.814666 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-24ghv"] Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.873117 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4970-account-create-b52jf" event={"ID":"b7923001-b1a5-4e98-8776-9acdf9f161a1","Type":"ContainerStarted","Data":"b752da63f29d63acf5c30f38b9523fc1d52c51d3b374acf89bfb4f32a450da7a"} Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.874235 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"96edd8f72c1a42a974e03a105dddfe67a447ec02e81e0e010591b31ea484ee9c"} Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.896032 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.896100 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.896148 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lk88\" (UniqueName: \"kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.896165 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.997663 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.997730 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.997757 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lk88\" (UniqueName: 
\"kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:43 crc kubenswrapper[4700]: I1007 11:36:43.997775 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.004801 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.005389 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.019645 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data\") pod \"glance-db-sync-24ghv\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.020916 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lk88\" (UniqueName: \"kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88\") pod \"glance-db-sync-24ghv\" (UID: 
\"40ee0582-037f-452a-a529-bedd3e1f51c9\") " pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.023463 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5355-account-create-2pblh"] Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.115663 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-24ghv" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.263084 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.270619 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m9nzp" podUID="f39e97ad-dbbb-45d4-a595-8f675165ed7d" containerName="ovn-controller" probeResult="failure" output=< Oct 07 11:36:44 crc kubenswrapper[4700]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 11:36:44 crc kubenswrapper[4700]: > Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.298478 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ksvhb" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.525918 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m9nzp-config-5k4t2"] Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.528653 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.530960 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m9nzp-config-5k4t2"] Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.531698 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607420 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607492 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607512 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607529 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8726\" (UniqueName: \"kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: 
\"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607813 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.607914 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709523 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709635 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709705 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" 
(UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709770 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709793 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.709816 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8726\" (UniqueName: \"kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.710224 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.710251 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: 
\"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.710286 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.711089 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.715129 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.729149 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8726\" (UniqueName: \"kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726\") pod \"ovn-controller-m9nzp-config-5k4t2\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.794670 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-24ghv"] Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.891743 4700 generic.go:334] "Generic (PLEG): container finished" podID="cefeb69f-ade8-41be-9525-ed5d017c8b2b" 
containerID="a5b0c25c2b2a7455592b2febbd21a4fe93b84d1ed938c4e29250fa55202ee554" exitCode=0 Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.891838 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5355-account-create-2pblh" event={"ID":"cefeb69f-ade8-41be-9525-ed5d017c8b2b","Type":"ContainerDied","Data":"a5b0c25c2b2a7455592b2febbd21a4fe93b84d1ed938c4e29250fa55202ee554"} Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.891885 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5355-account-create-2pblh" event={"ID":"cefeb69f-ade8-41be-9525-ed5d017c8b2b","Type":"ContainerStarted","Data":"38f054d8ac5d0ca243ef939b831720f638d9e1de16678eb2bea7c3f7a1820645"} Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.893421 4700 generic.go:334] "Generic (PLEG): container finished" podID="b7923001-b1a5-4e98-8776-9acdf9f161a1" containerID="b1c70a7c665f63dafd46d84580594819f06d1e9edaa6baa472bfafe3204dee91" exitCode=0 Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.894385 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4970-account-create-b52jf" event={"ID":"b7923001-b1a5-4e98-8776-9acdf9f161a1","Type":"ContainerDied","Data":"b1c70a7c665f63dafd46d84580594819f06d1e9edaa6baa472bfafe3204dee91"} Oct 07 11:36:44 crc kubenswrapper[4700]: I1007 11:36:44.906061 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.333936 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.334540 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.334609 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.335520 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.335577 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee" gracePeriod=600 Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.645519 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-m9nzp-config-5k4t2"] Oct 07 11:36:45 crc kubenswrapper[4700]: W1007 11:36:45.664009 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode53cd853_975a_4992_aae8_b337b28df8f8.slice/crio-6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4 WatchSource:0}: Error finding container 6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4: Status 404 returned error can't find the container with id 6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4 Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.907482 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee" exitCode=0 Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.907568 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.907896 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.907919 4700 scope.go:117] "RemoveContainer" containerID="01b641391ace1b00a610b14b6a967ed35cc42ded426b03a9ec0a64a8438621b6" Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.912567 4700 generic.go:334] "Generic (PLEG): container finished" podID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerID="3ed13cf2f5a8cd7c7fa788fceb5e49f8318d6db89d3b8d267bcf7127ecee2337" exitCode=0 Oct 07 11:36:45 crc kubenswrapper[4700]: 
I1007 11:36:45.912671 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerDied","Data":"3ed13cf2f5a8cd7c7fa788fceb5e49f8318d6db89d3b8d267bcf7127ecee2337"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.918077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m9nzp-config-5k4t2" event={"ID":"e53cd853-975a-4992-aae8-b337b28df8f8","Type":"ContainerStarted","Data":"6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.919872 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-24ghv" event={"ID":"40ee0582-037f-452a-a529-bedd3e1f51c9","Type":"ContainerStarted","Data":"244f1754ce3233246c0175f64d3c2600e07380dfb7194fea6dd6454f0507dbe0"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.928110 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"1cd3810ef1a81c13b97974c801c0addd9b70357bac26451244d899e9b0159ad9"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.928156 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"affbe4ed4805aeefa0390d02cef51e8fb215c5b9c09a20631baf44d5954dd61a"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.928165 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"44af29e0dfcbbc31fa1d7b27016640fe79421a1ca28447bd276ee672d38bdaf5"} Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.930880 4700 generic.go:334] "Generic (PLEG): container finished" podID="ca1c2675-0718-4979-98b8-9227bc9c5f18" 
containerID="a46dc66bd8afce5928f040158bfee6805a39b6a68e5c8e88e1b819c3300cd2ab" exitCode=0 Oct 07 11:36:45 crc kubenswrapper[4700]: I1007 11:36:45.930966 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerDied","Data":"a46dc66bd8afce5928f040158bfee6805a39b6a68e5c8e88e1b819c3300cd2ab"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.361303 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.471893 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.537263 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgth5\" (UniqueName: \"kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5\") pod \"cefeb69f-ade8-41be-9525-ed5d017c8b2b\" (UID: \"cefeb69f-ade8-41be-9525-ed5d017c8b2b\") " Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.542483 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5" (OuterVolumeSpecName: "kube-api-access-bgth5") pod "cefeb69f-ade8-41be-9525-ed5d017c8b2b" (UID: "cefeb69f-ade8-41be-9525-ed5d017c8b2b"). InnerVolumeSpecName "kube-api-access-bgth5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.638203 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn56t\" (UniqueName: \"kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t\") pod \"b7923001-b1a5-4e98-8776-9acdf9f161a1\" (UID: \"b7923001-b1a5-4e98-8776-9acdf9f161a1\") " Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.638688 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgth5\" (UniqueName: \"kubernetes.io/projected/cefeb69f-ade8-41be-9525-ed5d017c8b2b-kube-api-access-bgth5\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.642169 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t" (OuterVolumeSpecName: "kube-api-access-nn56t") pod "b7923001-b1a5-4e98-8776-9acdf9f161a1" (UID: "b7923001-b1a5-4e98-8776-9acdf9f161a1"). InnerVolumeSpecName "kube-api-access-nn56t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.739743 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn56t\" (UniqueName: \"kubernetes.io/projected/b7923001-b1a5-4e98-8776-9acdf9f161a1-kube-api-access-nn56t\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.939684 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4970-account-create-b52jf" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.939692 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4970-account-create-b52jf" event={"ID":"b7923001-b1a5-4e98-8776-9acdf9f161a1","Type":"ContainerDied","Data":"b752da63f29d63acf5c30f38b9523fc1d52c51d3b374acf89bfb4f32a450da7a"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.939754 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b752da63f29d63acf5c30f38b9523fc1d52c51d3b374acf89bfb4f32a450da7a" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.942590 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"8ab56bf2bb0b794fae9ad38e9a95c42a741a7c369fd72ebcea462ed53a5d5d15"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.944416 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerStarted","Data":"a21209fc50993df12343081ce8aa6918e4133ef130b5b9f4e4132912fb4c2658"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.945497 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.946992 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5355-account-create-2pblh" event={"ID":"cefeb69f-ade8-41be-9525-ed5d017c8b2b","Type":"ContainerDied","Data":"38f054d8ac5d0ca243ef939b831720f638d9e1de16678eb2bea7c3f7a1820645"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.947181 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f054d8ac5d0ca243ef939b831720f638d9e1de16678eb2bea7c3f7a1820645" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.947218 
4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5355-account-create-2pblh" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.951455 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerStarted","Data":"726411ac54738387dce7118ef2b69e0ad2b1bb7ebb6f7ad31f7c1fec21f41203"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.952758 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.955143 4700 generic.go:334] "Generic (PLEG): container finished" podID="e53cd853-975a-4992-aae8-b337b28df8f8" containerID="ba4093bbaa3c2de7ca0aa39ce7b8afb747128458eb7ba59f3f285d9ca6373e48" exitCode=0 Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.955173 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m9nzp-config-5k4t2" event={"ID":"e53cd853-975a-4992-aae8-b337b28df8f8","Type":"ContainerDied","Data":"ba4093bbaa3c2de7ca0aa39ce7b8afb747128458eb7ba59f3f285d9ca6373e48"} Oct 07 11:36:46 crc kubenswrapper[4700]: I1007 11:36:46.983585 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.520921713999996 podStartE2EDuration="57.983566383s" podCreationTimestamp="2025-10-07 11:35:49 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.186414984 +0000 UTC m=+929.982813973" lastFinishedPulling="2025-10-07 11:36:10.649059643 +0000 UTC m=+937.445458642" observedRunningTime="2025-10-07 11:36:46.972924094 +0000 UTC m=+973.769323103" watchObservedRunningTime="2025-10-07 11:36:46.983566383 +0000 UTC m=+973.779965372" Oct 07 11:36:47 crc kubenswrapper[4700]: I1007 11:36:47.021652 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=51.631521278 podStartE2EDuration="59.021631559s" podCreationTimestamp="2025-10-07 11:35:48 +0000 UTC" firstStartedPulling="2025-10-07 11:36:03.163028552 +0000 UTC m=+929.959427541" lastFinishedPulling="2025-10-07 11:36:10.553138793 +0000 UTC m=+937.349537822" observedRunningTime="2025-10-07 11:36:47.020442368 +0000 UTC m=+973.816841367" watchObservedRunningTime="2025-10-07 11:36:47.021631559 +0000 UTC m=+973.818030558" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.361744 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472129 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts\") pod \"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472220 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn\") pod \"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472330 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run\") pod \"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472369 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8726\" (UniqueName: \"kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726\") pod 
\"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472384 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn\") pod \"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.472459 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts\") pod \"e53cd853-975a-4992-aae8-b337b28df8f8\" (UID: \"e53cd853-975a-4992-aae8-b337b28df8f8\") " Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.473366 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.473420 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run" (OuterVolumeSpecName: "var-run") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.473424 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.474151 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.474369 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts" (OuterVolumeSpecName: "scripts") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.480457 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726" (OuterVolumeSpecName: "kube-api-access-f8726") pod "e53cd853-975a-4992-aae8-b337b28df8f8" (UID: "e53cd853-975a-4992-aae8-b337b28df8f8"). InnerVolumeSpecName "kube-api-access-f8726". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574495 4700 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574526 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574536 4700 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e53cd853-975a-4992-aae8-b337b28df8f8-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574546 4700 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574555 4700 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e53cd853-975a-4992-aae8-b337b28df8f8-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.574564 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8726\" (UniqueName: \"kubernetes.io/projected/e53cd853-975a-4992-aae8-b337b28df8f8-kube-api-access-f8726\") on node \"crc\" DevicePath \"\"" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.977089 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"a42e4a885d9a085102f4ec00832e33523048b0f401b68c506328be5f088916eb"} Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 
11:36:48.977502 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"df6d51d525d38ac1d9334bf77913d26411d6ed2972e21419634d7697f2b68723"} Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.977517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"81b517155b4c4af6eb40a718fdd9ba6790374a4d51e05a0c56731464c4b9896b"} Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.977526 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"b2709805ca43f3ad834ceddef75ed27d2850ef678df90844fda5a7db3f8c9ea3"} Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.979587 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m9nzp-config-5k4t2" event={"ID":"e53cd853-975a-4992-aae8-b337b28df8f8","Type":"ContainerDied","Data":"6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4"} Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.979632 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b29cf42ed194be198ca7b8eb2c89c85e9ccd4327d1006c8419b6d1b2250ccd4" Oct 07 11:36:48 crc kubenswrapper[4700]: I1007 11:36:48.979688 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m9nzp-config-5k4t2" Oct 07 11:36:49 crc kubenswrapper[4700]: I1007 11:36:49.175104 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m9nzp" Oct 07 11:36:49 crc kubenswrapper[4700]: I1007 11:36:49.476939 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m9nzp-config-5k4t2"] Oct 07 11:36:49 crc kubenswrapper[4700]: I1007 11:36:49.483997 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m9nzp-config-5k4t2"] Oct 07 11:36:49 crc kubenswrapper[4700]: I1007 11:36:49.966932 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53cd853-975a-4992-aae8-b337b28df8f8" path="/var/lib/kubelet/pods/e53cd853-975a-4992-aae8-b337b28df8f8/volumes" Oct 07 11:36:51 crc kubenswrapper[4700]: I1007 11:36:51.002521 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"31bb071ecd2a055cf6b5702059e745d889c19d6182117f1aba57bb5365738aca"} Oct 07 11:36:51 crc kubenswrapper[4700]: I1007 11:36:51.002759 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"2c405cef70d537167a75fde3b4a31740e21427ae0d3dd0ee72ec0c96281b7de9"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.018220 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"4bdac153593ae8b4f445fafa78a045b45222d110a540bbedcc290e46b38a1a48"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.018270 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"1f20ea13cba595bb0c0a7b6483b68aeeffc6961872e04116cf05693e5ba415ef"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.018283 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"a9ffd2f96314dc3f61f99ac60bf80c2c227e2625066b79ca7fef39c1f48633e5"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.018291 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"ec19c7f4941191a4051aed8127533e6bf964fba2320b093f579d8e42bde72a7b"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.018299 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6555d4a9-f098-43b2-9b50-f7c9d855cf6a","Type":"ContainerStarted","Data":"7e45d858e89c4cf484af4bb1d383e03476b1ce532f5f04002fce08604bea594c"} Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.309855 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.213248751 podStartE2EDuration="27.30983255s" podCreationTimestamp="2025-10-07 11:36:25 +0000 UTC" firstStartedPulling="2025-10-07 11:36:43.357244727 +0000 UTC m=+970.153643706" lastFinishedPulling="2025-10-07 11:36:50.453828526 +0000 UTC m=+977.250227505" observedRunningTime="2025-10-07 11:36:52.067690593 +0000 UTC m=+978.864089582" watchObservedRunningTime="2025-10-07 11:36:52.30983255 +0000 UTC m=+979.106231539" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.314826 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:36:52 crc kubenswrapper[4700]: E1007 11:36:52.315229 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b7923001-b1a5-4e98-8776-9acdf9f161a1" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315251 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7923001-b1a5-4e98-8776-9acdf9f161a1" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: E1007 11:36:52.315287 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53cd853-975a-4992-aae8-b337b28df8f8" containerName="ovn-config" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315297 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53cd853-975a-4992-aae8-b337b28df8f8" containerName="ovn-config" Oct 07 11:36:52 crc kubenswrapper[4700]: E1007 11:36:52.315322 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefeb69f-ade8-41be-9525-ed5d017c8b2b" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315330 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefeb69f-ade8-41be-9525-ed5d017c8b2b" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315525 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefeb69f-ade8-41be-9525-ed5d017c8b2b" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315543 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7923001-b1a5-4e98-8776-9acdf9f161a1" containerName="mariadb-account-create" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.315560 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53cd853-975a-4992-aae8-b337b28df8f8" containerName="ovn-config" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.316555 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.321149 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.328166 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429453 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429500 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429601 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429718 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429762 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4z4\" (UniqueName: \"kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.429922 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.532796 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.532921 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.532969 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 
07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.533015 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.533060 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.533089 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4z4\" (UniqueName: \"kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.533973 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.534086 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.534153 
4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.534178 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.534609 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.564761 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4z4\" (UniqueName: \"kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4\") pod \"dnsmasq-dns-77585f5f8c-dmn6s\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:52 crc kubenswrapper[4700]: I1007 11:36:52.634598 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:36:59 crc kubenswrapper[4700]: I1007 11:36:59.622511 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.075532 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.114062 4700 generic.go:334] "Generic (PLEG): container finished" podID="944eef79-0417-4a6d-b965-807e00d64351" containerID="eca12d80c3de6e4768a4a0dfaa7804d2098fff2cb9593407cbc853fc0c08904c" exitCode=0 Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.114231 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" event={"ID":"944eef79-0417-4a6d-b965-807e00d64351","Type":"ContainerDied","Data":"eca12d80c3de6e4768a4a0dfaa7804d2098fff2cb9593407cbc853fc0c08904c"} Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.114286 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" event={"ID":"944eef79-0417-4a6d-b965-807e00d64351","Type":"ContainerStarted","Data":"22e346303c4630bdcceea97bd18b48d5b84c50bc45f81fbe5dd3287cced08707"} Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.122036 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-24ghv" event={"ID":"40ee0582-037f-452a-a529-bedd3e1f51c9","Type":"ContainerStarted","Data":"f5373611b12383a8276693b628d0d46bbe05067429ae8a07113a26b059d96ea7"} Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.170632 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-24ghv" podStartSLOduration=3.109465278 podStartE2EDuration="17.170599189s" podCreationTimestamp="2025-10-07 11:36:43 +0000 UTC" firstStartedPulling="2025-10-07 11:36:45.182043215 +0000 UTC m=+971.978442204" lastFinishedPulling="2025-10-07 
11:36:59.243177126 +0000 UTC m=+986.039576115" observedRunningTime="2025-10-07 11:37:00.158484162 +0000 UTC m=+986.954883191" watchObservedRunningTime="2025-10-07 11:37:00.170599189 +0000 UTC m=+986.966998218" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.398460 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.461022 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cktjx"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.462358 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.493150 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48sv\" (UniqueName: \"kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv\") pod \"cinder-db-create-cktjx\" (UID: \"cd89bc50-bb18-405d-863b-5edf57509f7f\") " pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.547541 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cktjx"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.561340 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ckw8z"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.562406 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.570053 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ckw8z"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.595063 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48sv\" (UniqueName: \"kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv\") pod \"cinder-db-create-cktjx\" (UID: \"cd89bc50-bb18-405d-863b-5edf57509f7f\") " pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.595377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcfl\" (UniqueName: \"kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl\") pod \"barbican-db-create-ckw8z\" (UID: \"392647fc-fc26-4946-9e59-9a3af9283041\") " pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.618461 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48sv\" (UniqueName: \"kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv\") pod \"cinder-db-create-cktjx\" (UID: \"cd89bc50-bb18-405d-863b-5edf57509f7f\") " pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.629747 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-ffmxn"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.630738 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.675608 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ffmxn"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.697198 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946s2\" (UniqueName: \"kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2\") pod \"heat-db-create-ffmxn\" (UID: \"41f08d93-f0df-4917-a21a-70f26ac33e1f\") " pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.697323 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcfl\" (UniqueName: \"kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl\") pod \"barbican-db-create-ckw8z\" (UID: \"392647fc-fc26-4946-9e59-9a3af9283041\") " pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.724692 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcfl\" (UniqueName: \"kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl\") pod \"barbican-db-create-ckw8z\" (UID: \"392647fc-fc26-4946-9e59-9a3af9283041\") " pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.744797 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-964gm"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.746230 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-964gm" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.755678 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-964gm"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.778750 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.799037 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x\") pod \"neutron-db-create-964gm\" (UID: \"de501b70-8d75-489f-9826-e0f1dd7306f7\") " pod="openstack/neutron-db-create-964gm" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.799174 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946s2\" (UniqueName: \"kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2\") pod \"heat-db-create-ffmxn\" (UID: \"41f08d93-f0df-4917-a21a-70f26ac33e1f\") " pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.820194 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946s2\" (UniqueName: \"kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2\") pod \"heat-db-create-ffmxn\" (UID: \"41f08d93-f0df-4917-a21a-70f26ac33e1f\") " pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.827607 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8rr4j"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.828829 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.830408 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.831175 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-djssz" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.831951 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.832158 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.851247 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8rr4j"] Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.884058 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.900624 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.900691 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.900718 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x\") pod \"neutron-db-create-964gm\" (UID: \"de501b70-8d75-489f-9826-e0f1dd7306f7\") " pod="openstack/neutron-db-create-964gm" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.900742 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrww\" (UniqueName: \"kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.925509 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x\") pod \"neutron-db-create-964gm\" (UID: \"de501b70-8d75-489f-9826-e0f1dd7306f7\") " pod="openstack/neutron-db-create-964gm" Oct 07 11:37:00 crc kubenswrapper[4700]: I1007 11:37:00.977871 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.002849 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.002937 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.002968 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrww\" (UniqueName: \"kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.008544 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.010211 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.026263 
4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrww\" (UniqueName: \"kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww\") pod \"keystone-db-sync-8rr4j\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.079023 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-964gm" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.137879 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" event={"ID":"944eef79-0417-4a6d-b965-807e00d64351","Type":"ContainerStarted","Data":"4e08da9ca45d7f088c3286e015ccfbe4bdf01fec95faf7e7272a0a3a4f18d0c5"} Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.137966 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.148551 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.168320 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" podStartSLOduration=9.16827984 podStartE2EDuration="9.16827984s" podCreationTimestamp="2025-10-07 11:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:01.160756513 +0000 UTC m=+987.957155522" watchObservedRunningTime="2025-10-07 11:37:01.16827984 +0000 UTC m=+987.964678849" Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.214939 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ckw8z"] Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.280782 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cktjx"] Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.336655 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ffmxn"] Oct 07 11:37:01 crc kubenswrapper[4700]: W1007 11:37:01.380750 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f08d93_f0df_4917_a21a_70f26ac33e1f.slice/crio-9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a WatchSource:0}: Error finding container 9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a: Status 404 returned error can't find the container with id 9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.515291 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-964gm"] Oct 07 11:37:01 crc kubenswrapper[4700]: W1007 11:37:01.522911 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde501b70_8d75_489f_9826_e0f1dd7306f7.slice/crio-e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e WatchSource:0}: Error finding container e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e: Status 404 returned error can't find the container with id e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e Oct 07 11:37:01 crc kubenswrapper[4700]: I1007 11:37:01.651110 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8rr4j"] Oct 07 11:37:01 crc kubenswrapper[4700]: W1007 11:37:01.657739 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c677ff_4aff_4044_b765_e0b3de056302.slice/crio-33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d WatchSource:0}: Error finding container 33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d: Status 404 returned error can't find the container with id 33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.145967 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8rr4j" event={"ID":"90c677ff-4aff-4044-b765-e0b3de056302","Type":"ContainerStarted","Data":"33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.148164 4700 generic.go:334] "Generic (PLEG): container finished" podID="cd89bc50-bb18-405d-863b-5edf57509f7f" containerID="d6519d39549e4cafa9a6067d639928383eaf545d245f7975404db2d3de28f026" exitCode=0 Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.148249 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cktjx" event={"ID":"cd89bc50-bb18-405d-863b-5edf57509f7f","Type":"ContainerDied","Data":"d6519d39549e4cafa9a6067d639928383eaf545d245f7975404db2d3de28f026"} Oct 07 11:37:02 
crc kubenswrapper[4700]: I1007 11:37:02.148287 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cktjx" event={"ID":"cd89bc50-bb18-405d-863b-5edf57509f7f","Type":"ContainerStarted","Data":"bed25eac181297c1de315bf49bdeb320179f285ae8aa5fc5a92e0b2fc3437f1b"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.150675 4700 generic.go:334] "Generic (PLEG): container finished" podID="41f08d93-f0df-4917-a21a-70f26ac33e1f" containerID="64f961a309a5f564e3068d2b7817cb5cc6fe374991c15711927ddcdc348fc56a" exitCode=0 Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.150766 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ffmxn" event={"ID":"41f08d93-f0df-4917-a21a-70f26ac33e1f","Type":"ContainerDied","Data":"64f961a309a5f564e3068d2b7817cb5cc6fe374991c15711927ddcdc348fc56a"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.150841 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ffmxn" event={"ID":"41f08d93-f0df-4917-a21a-70f26ac33e1f","Type":"ContainerStarted","Data":"9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.152788 4700 generic.go:334] "Generic (PLEG): container finished" podID="392647fc-fc26-4946-9e59-9a3af9283041" containerID="40348253aebbd51116f135f23c070043b62fbab44caf0b885192942bf8a6c408" exitCode=0 Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.152873 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckw8z" event={"ID":"392647fc-fc26-4946-9e59-9a3af9283041","Type":"ContainerDied","Data":"40348253aebbd51116f135f23c070043b62fbab44caf0b885192942bf8a6c408"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.152896 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckw8z" 
event={"ID":"392647fc-fc26-4946-9e59-9a3af9283041","Type":"ContainerStarted","Data":"69db1736c506ac2e35e4453f3121c0167e42f7575a42bd378c9c5599b5ae9532"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.154499 4700 generic.go:334] "Generic (PLEG): container finished" podID="de501b70-8d75-489f-9826-e0f1dd7306f7" containerID="9cf6042d6634403e51838cea86a617b31b97412c31cddb072e05249464f8d133" exitCode=0 Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.154574 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-964gm" event={"ID":"de501b70-8d75-489f-9826-e0f1dd7306f7","Type":"ContainerDied","Data":"9cf6042d6634403e51838cea86a617b31b97412c31cddb072e05249464f8d133"} Oct 07 11:37:02 crc kubenswrapper[4700]: I1007 11:37:02.154639 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-964gm" event={"ID":"de501b70-8d75-489f-9826-e0f1dd7306f7","Type":"ContainerStarted","Data":"e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e"} Oct 07 11:37:03 crc kubenswrapper[4700]: I1007 11:37:03.596624 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:03 crc kubenswrapper[4700]: I1007 11:37:03.649923 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbcfl\" (UniqueName: \"kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl\") pod \"392647fc-fc26-4946-9e59-9a3af9283041\" (UID: \"392647fc-fc26-4946-9e59-9a3af9283041\") " Oct 07 11:37:03 crc kubenswrapper[4700]: I1007 11:37:03.654999 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl" (OuterVolumeSpecName: "kube-api-access-hbcfl") pod "392647fc-fc26-4946-9e59-9a3af9283041" (UID: "392647fc-fc26-4946-9e59-9a3af9283041"). InnerVolumeSpecName "kube-api-access-hbcfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:03 crc kubenswrapper[4700]: I1007 11:37:03.752119 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbcfl\" (UniqueName: \"kubernetes.io/projected/392647fc-fc26-4946-9e59-9a3af9283041-kube-api-access-hbcfl\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:03 crc kubenswrapper[4700]: E1007 11:37:03.934382 4700 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.13:44876->38.102.83.13:35247: read tcp 38.102.83.13:44876->38.102.83.13:35247: read: connection reset by peer Oct 07 11:37:04 crc kubenswrapper[4700]: I1007 11:37:04.194949 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckw8z" event={"ID":"392647fc-fc26-4946-9e59-9a3af9283041","Type":"ContainerDied","Data":"69db1736c506ac2e35e4453f3121c0167e42f7575a42bd378c9c5599b5ae9532"} Oct 07 11:37:04 crc kubenswrapper[4700]: I1007 11:37:04.195315 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69db1736c506ac2e35e4453f3121c0167e42f7575a42bd378c9c5599b5ae9532" Oct 07 11:37:04 crc kubenswrapper[4700]: I1007 11:37:04.195010 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ckw8z" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.208819 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.215643 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cktjx" event={"ID":"cd89bc50-bb18-405d-863b-5edf57509f7f","Type":"ContainerDied","Data":"bed25eac181297c1de315bf49bdeb320179f285ae8aa5fc5a92e0b2fc3437f1b"} Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.215684 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed25eac181297c1de315bf49bdeb320179f285ae8aa5fc5a92e0b2fc3437f1b" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.218844 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-964gm" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.218840 4700 generic.go:334] "Generic (PLEG): container finished" podID="40ee0582-037f-452a-a529-bedd3e1f51c9" containerID="f5373611b12383a8276693b628d0d46bbe05067429ae8a07113a26b059d96ea7" exitCode=0 Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.218988 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-24ghv" event={"ID":"40ee0582-037f-452a-a529-bedd3e1f51c9","Type":"ContainerDied","Data":"f5373611b12383a8276693b628d0d46bbe05067429ae8a07113a26b059d96ea7"} Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.221230 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ffmxn" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.221235 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ffmxn" event={"ID":"41f08d93-f0df-4917-a21a-70f26ac33e1f","Type":"ContainerDied","Data":"9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a"} Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.221500 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9700526784c3b1865b196e26f67362a4cc77d29a78e252f2b88f073cee284c1a" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.229892 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.250664 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-964gm" event={"ID":"de501b70-8d75-489f-9826-e0f1dd7306f7","Type":"ContainerDied","Data":"e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e"} Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.250723 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e853c5a69e114d32188291dc43dc178d01038611f4f07d1628732fe590fdb16e" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.250820 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-964gm" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.303325 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-946s2\" (UniqueName: \"kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2\") pod \"41f08d93-f0df-4917-a21a-70f26ac33e1f\" (UID: \"41f08d93-f0df-4917-a21a-70f26ac33e1f\") " Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.303848 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x\") pod \"de501b70-8d75-489f-9826-e0f1dd7306f7\" (UID: \"de501b70-8d75-489f-9826-e0f1dd7306f7\") " Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.303888 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48sv\" (UniqueName: \"kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv\") pod \"cd89bc50-bb18-405d-863b-5edf57509f7f\" (UID: \"cd89bc50-bb18-405d-863b-5edf57509f7f\") " Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.307951 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x" (OuterVolumeSpecName: "kube-api-access-sks2x") pod "de501b70-8d75-489f-9826-e0f1dd7306f7" (UID: "de501b70-8d75-489f-9826-e0f1dd7306f7"). InnerVolumeSpecName "kube-api-access-sks2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.308823 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2" (OuterVolumeSpecName: "kube-api-access-946s2") pod "41f08d93-f0df-4917-a21a-70f26ac33e1f" (UID: "41f08d93-f0df-4917-a21a-70f26ac33e1f"). 
InnerVolumeSpecName "kube-api-access-946s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.308974 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv" (OuterVolumeSpecName: "kube-api-access-s48sv") pod "cd89bc50-bb18-405d-863b-5edf57509f7f" (UID: "cd89bc50-bb18-405d-863b-5edf57509f7f"). InnerVolumeSpecName "kube-api-access-s48sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.406039 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946s2\" (UniqueName: \"kubernetes.io/projected/41f08d93-f0df-4917-a21a-70f26ac33e1f-kube-api-access-946s2\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.406068 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/de501b70-8d75-489f-9826-e0f1dd7306f7-kube-api-access-sks2x\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:06 crc kubenswrapper[4700]: I1007 11:37:06.406080 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s48sv\" (UniqueName: \"kubernetes.io/projected/cd89bc50-bb18-405d-863b-5edf57509f7f-kube-api-access-s48sv\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.273061 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cktjx" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.273066 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8rr4j" event={"ID":"90c677ff-4aff-4044-b765-e0b3de056302","Type":"ContainerStarted","Data":"119214a849ac67aa5eea076516b83ce8f957dd596c73519418edbb8198b10be8"} Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.338994 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8rr4j" podStartSLOduration=2.935058399 podStartE2EDuration="7.338941826s" podCreationTimestamp="2025-10-07 11:37:00 +0000 UTC" firstStartedPulling="2025-10-07 11:37:01.664802594 +0000 UTC m=+988.461201583" lastFinishedPulling="2025-10-07 11:37:06.068685981 +0000 UTC m=+992.865085010" observedRunningTime="2025-10-07 11:37:07.302489502 +0000 UTC m=+994.098888541" watchObservedRunningTime="2025-10-07 11:37:07.338941826 +0000 UTC m=+994.135340825" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.636520 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.692069 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.692291 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-brvlc" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="dnsmasq-dns" containerID="cri-o://8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833" gracePeriod=10 Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.929354 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-24ghv" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.938460 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data\") pod \"40ee0582-037f-452a-a529-bedd3e1f51c9\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.938524 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle\") pod \"40ee0582-037f-452a-a529-bedd3e1f51c9\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.938552 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data\") pod \"40ee0582-037f-452a-a529-bedd3e1f51c9\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.938626 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lk88\" (UniqueName: \"kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88\") pod \"40ee0582-037f-452a-a529-bedd3e1f51c9\" (UID: \"40ee0582-037f-452a-a529-bedd3e1f51c9\") " Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.948881 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "40ee0582-037f-452a-a529-bedd3e1f51c9" (UID: "40ee0582-037f-452a-a529-bedd3e1f51c9"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.957151 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88" (OuterVolumeSpecName: "kube-api-access-7lk88") pod "40ee0582-037f-452a-a529-bedd3e1f51c9" (UID: "40ee0582-037f-452a-a529-bedd3e1f51c9"). InnerVolumeSpecName "kube-api-access-7lk88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:07 crc kubenswrapper[4700]: I1007 11:37:07.980836 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40ee0582-037f-452a-a529-bedd3e1f51c9" (UID: "40ee0582-037f-452a-a529-bedd3e1f51c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.015586 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data" (OuterVolumeSpecName: "config-data") pod "40ee0582-037f-452a-a529-bedd3e1f51c9" (UID: "40ee0582-037f-452a-a529-bedd3e1f51c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.039869 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lk88\" (UniqueName: \"kubernetes.io/projected/40ee0582-037f-452a-a529-bedd3e1f51c9-kube-api-access-7lk88\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.039900 4700 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.039909 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.039925 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ee0582-037f-452a-a529-bedd3e1f51c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.104216 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.242804 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7c7t\" (UniqueName: \"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t\") pod \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.243050 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb\") pod \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.243123 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config\") pod \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.243141 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc\") pod \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.243196 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb\") pod \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\" (UID: \"0f6ec6b3-bf89-4f27-a2df-d07a61eee130\") " Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.248590 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t" (OuterVolumeSpecName: "kube-api-access-j7c7t") pod "0f6ec6b3-bf89-4f27-a2df-d07a61eee130" (UID: "0f6ec6b3-bf89-4f27-a2df-d07a61eee130"). InnerVolumeSpecName "kube-api-access-j7c7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.290293 4700 generic.go:334] "Generic (PLEG): container finished" podID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerID="8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833" exitCode=0 Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.290445 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-brvlc" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.290574 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerDied","Data":"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833"} Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.290607 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-brvlc" event={"ID":"0f6ec6b3-bf89-4f27-a2df-d07a61eee130","Type":"ContainerDied","Data":"250e9a5e1f2f5cc9fcf59cb967649e997a04b0f4bcf6783c484211000c33e1f6"} Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.290625 4700 scope.go:117] "RemoveContainer" containerID="8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.292508 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f6ec6b3-bf89-4f27-a2df-d07a61eee130" (UID: "0f6ec6b3-bf89-4f27-a2df-d07a61eee130"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.294343 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-24ghv" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.294396 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-24ghv" event={"ID":"40ee0582-037f-452a-a529-bedd3e1f51c9","Type":"ContainerDied","Data":"244f1754ce3233246c0175f64d3c2600e07380dfb7194fea6dd6454f0507dbe0"} Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.294456 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244f1754ce3233246c0175f64d3c2600e07380dfb7194fea6dd6454f0507dbe0" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.298872 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config" (OuterVolumeSpecName: "config") pod "0f6ec6b3-bf89-4f27-a2df-d07a61eee130" (UID: "0f6ec6b3-bf89-4f27-a2df-d07a61eee130"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.312320 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f6ec6b3-bf89-4f27-a2df-d07a61eee130" (UID: "0f6ec6b3-bf89-4f27-a2df-d07a61eee130"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.323762 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f6ec6b3-bf89-4f27-a2df-d07a61eee130" (UID: "0f6ec6b3-bf89-4f27-a2df-d07a61eee130"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.345090 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7c7t\" (UniqueName: \"kubernetes.io/projected/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-kube-api-access-j7c7t\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.345117 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.345126 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.345157 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.345165 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f6ec6b3-bf89-4f27-a2df-d07a61eee130-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.383891 4700 scope.go:117] "RemoveContainer" containerID="7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.419244 4700 scope.go:117] "RemoveContainer" containerID="8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.419758 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833\": container with ID starting with 8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833 not found: ID does not exist" containerID="8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.419794 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833"} err="failed to get container status \"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833\": rpc error: code = NotFound desc = could not find container \"8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833\": container with ID starting with 8d3af7cd8017204a38e31f2c00f0163c15c749089e772805f1c7805c37f96833 not found: ID does not exist" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.419819 4700 scope.go:117] "RemoveContainer" containerID="7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.420720 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4\": container with ID starting with 7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4 not found: ID does not exist" containerID="7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.420777 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4"} err="failed to get container status \"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4\": rpc error: code = NotFound desc = could not find container \"7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4\": container with ID 
starting with 7cc34004825b9b0178a2d81e0a58de64c4845da4f033babfe04f2c97492042e4 not found: ID does not exist" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.628521 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.642171 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-brvlc"] Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659393 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659724 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd89bc50-bb18-405d-863b-5edf57509f7f" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659736 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd89bc50-bb18-405d-863b-5edf57509f7f" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659746 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392647fc-fc26-4946-9e59-9a3af9283041" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659752 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="392647fc-fc26-4946-9e59-9a3af9283041" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659768 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ee0582-037f-452a-a529-bedd3e1f51c9" containerName="glance-db-sync" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659775 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ee0582-037f-452a-a529-bedd3e1f51c9" containerName="glance-db-sync" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659789 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f08d93-f0df-4917-a21a-70f26ac33e1f" 
containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659795 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f08d93-f0df-4917-a21a-70f26ac33e1f" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659804 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="dnsmasq-dns" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659810 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="dnsmasq-dns" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659820 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de501b70-8d75-489f-9826-e0f1dd7306f7" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659825 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="de501b70-8d75-489f-9826-e0f1dd7306f7" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: E1007 11:37:08.659836 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="init" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659841 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="init" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.659996 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ee0582-037f-452a-a529-bedd3e1f51c9" containerName="glance-db-sync" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660009 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="392647fc-fc26-4946-9e59-9a3af9283041" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660019 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd89bc50-bb18-405d-863b-5edf57509f7f" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660033 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f08d93-f0df-4917-a21a-70f26ac33e1f" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660039 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" containerName="dnsmasq-dns" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660051 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="de501b70-8d75-489f-9826-e0f1dd7306f7" containerName="mariadb-database-create" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.660823 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661552 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661595 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661637 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661656 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74794\" (UniqueName: \"kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661721 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.661773 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.713755 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763123 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763178 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763197 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74794\" (UniqueName: \"kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763234 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763250 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.763337 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.764138 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.764813 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.765382 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.766187 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.766793 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config\") pod \"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:08 crc kubenswrapper[4700]: I1007 11:37:08.783612 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74794\" (UniqueName: \"kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794\") pod 
\"dnsmasq-dns-7ff5475cc9-8wtm6\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:09 crc kubenswrapper[4700]: I1007 11:37:09.028751 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:09 crc kubenswrapper[4700]: I1007 11:37:09.491609 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:09 crc kubenswrapper[4700]: I1007 11:37:09.969065 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6ec6b3-bf89-4f27-a2df-d07a61eee130" path="/var/lib/kubelet/pods/0f6ec6b3-bf89-4f27-a2df-d07a61eee130/volumes" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.313831 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" event={"ID":"3a8b4bab-bbbf-44e8-8205-3eb77e531abf","Type":"ContainerStarted","Data":"381dcaec1abfe5eb901a6b6cf1162ac8a81bd795b34f6652b6cf3d39b5c58b3d"} Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.316070 4700 generic.go:334] "Generic (PLEG): container finished" podID="90c677ff-4aff-4044-b765-e0b3de056302" containerID="119214a849ac67aa5eea076516b83ce8f957dd596c73519418edbb8198b10be8" exitCode=0 Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.316111 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8rr4j" event={"ID":"90c677ff-4aff-4044-b765-e0b3de056302","Type":"ContainerDied","Data":"119214a849ac67aa5eea076516b83ce8f957dd596c73519418edbb8198b10be8"} Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.575934 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4750-account-create-h6rmf"] Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.584657 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.591569 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.599038 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs99x\" (UniqueName: \"kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x\") pod \"barbican-4750-account-create-h6rmf\" (UID: \"6bce568a-3e8c-4994-ab06-02cec9b80cc3\") " pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.617680 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4750-account-create-h6rmf"] Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.672152 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-897f-account-create-qqqnz"] Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.673593 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.676029 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.679900 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-897f-account-create-qqqnz"] Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.701532 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs99x\" (UniqueName: \"kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x\") pod \"barbican-4750-account-create-h6rmf\" (UID: \"6bce568a-3e8c-4994-ab06-02cec9b80cc3\") " pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.717348 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs99x\" (UniqueName: \"kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x\") pod \"barbican-4750-account-create-h6rmf\" (UID: \"6bce568a-3e8c-4994-ab06-02cec9b80cc3\") " pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.803711 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74r4r\" (UniqueName: \"kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r\") pod \"cinder-897f-account-create-qqqnz\" (UID: \"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9\") " pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.904718 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74r4r\" (UniqueName: \"kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r\") pod \"cinder-897f-account-create-qqqnz\" (UID: 
\"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9\") " pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.907622 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.924002 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74r4r\" (UniqueName: \"kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r\") pod \"cinder-897f-account-create-qqqnz\" (UID: \"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9\") " pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:10 crc kubenswrapper[4700]: I1007 11:37:10.994771 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:11.325549 4700 generic.go:334] "Generic (PLEG): container finished" podID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerID="51f9defbe0f6041e20233e1325512ba5c757afe4d829ec63ee8a530025bce97d" exitCode=0 Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:11.325587 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" event={"ID":"3a8b4bab-bbbf-44e8-8205-3eb77e531abf","Type":"ContainerDied","Data":"51f9defbe0f6041e20233e1325512ba5c757afe4d829ec63ee8a530025bce97d"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:11.461828 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4750-account-create-h6rmf"] Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:11.597979 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-897f-account-create-qqqnz"] Oct 07 11:37:12 crc kubenswrapper[4700]: W1007 11:37:11.609500 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab6e4ca_6676_4f0b_ac32_ca674ee7dfa9.slice/crio-b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6 WatchSource:0}: Error finding container b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6: Status 404 returned error can't find the container with id b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6 Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.336034 4700 generic.go:334] "Generic (PLEG): container finished" podID="1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" containerID="ddf67e85849c352426d96ae60bc9a62de6d8fbb658ded3171492394360995f3e" exitCode=0 Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.336082 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-897f-account-create-qqqnz" event={"ID":"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9","Type":"ContainerDied","Data":"ddf67e85849c352426d96ae60bc9a62de6d8fbb658ded3171492394360995f3e"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.336684 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-897f-account-create-qqqnz" event={"ID":"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9","Type":"ContainerStarted","Data":"b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.338510 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8rr4j" event={"ID":"90c677ff-4aff-4044-b765-e0b3de056302","Type":"ContainerDied","Data":"33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.338540 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33346308e121e23620208bc252fb8509e7038576cc0c22bbc325d02f5c9ff63d" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.344499 4700 generic.go:334] "Generic (PLEG): container finished" podID="6bce568a-3e8c-4994-ab06-02cec9b80cc3" 
containerID="c4a7c8d6c1646705da0979076f65c39e23bb8d3e449de3dc7b8a5014ca3b2f4e" exitCode=0 Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.344598 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4750-account-create-h6rmf" event={"ID":"6bce568a-3e8c-4994-ab06-02cec9b80cc3","Type":"ContainerDied","Data":"c4a7c8d6c1646705da0979076f65c39e23bb8d3e449de3dc7b8a5014ca3b2f4e"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.344631 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4750-account-create-h6rmf" event={"ID":"6bce568a-3e8c-4994-ab06-02cec9b80cc3","Type":"ContainerStarted","Data":"0fed068b28f3c64ed1a542aa2b2597a002a3752d16fa81c696daff085d73dd56"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.346226 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" event={"ID":"3a8b4bab-bbbf-44e8-8205-3eb77e531abf","Type":"ContainerStarted","Data":"b870a7fe10113bfb08c56ef47e45370554b1fa11d77722a5cd2eea0a8932f4ae"} Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.346511 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.376675 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" podStartSLOduration=4.376643979 podStartE2EDuration="4.376643979s" podCreationTimestamp="2025-10-07 11:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:12.370097678 +0000 UTC m=+999.166496707" watchObservedRunningTime="2025-10-07 11:37:12.376643979 +0000 UTC m=+999.173043008" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.378185 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.443633 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle\") pod \"90c677ff-4aff-4044-b765-e0b3de056302\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.443703 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data\") pod \"90c677ff-4aff-4044-b765-e0b3de056302\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.443746 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwrww\" (UniqueName: \"kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww\") pod \"90c677ff-4aff-4044-b765-e0b3de056302\" (UID: \"90c677ff-4aff-4044-b765-e0b3de056302\") " Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.453656 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww" (OuterVolumeSpecName: "kube-api-access-gwrww") pod "90c677ff-4aff-4044-b765-e0b3de056302" (UID: "90c677ff-4aff-4044-b765-e0b3de056302"). InnerVolumeSpecName "kube-api-access-gwrww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.484512 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90c677ff-4aff-4044-b765-e0b3de056302" (UID: "90c677ff-4aff-4044-b765-e0b3de056302"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.492387 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data" (OuterVolumeSpecName: "config-data") pod "90c677ff-4aff-4044-b765-e0b3de056302" (UID: "90c677ff-4aff-4044-b765-e0b3de056302"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.545532 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.545572 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c677ff-4aff-4044-b765-e0b3de056302-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:12 crc kubenswrapper[4700]: I1007 11:37:12.545585 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwrww\" (UniqueName: \"kubernetes.io/projected/90c677ff-4aff-4044-b765-e0b3de056302-kube-api-access-gwrww\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.352608 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8rr4j" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.614367 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.661283 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qt97c"] Oct 07 11:37:13 crc kubenswrapper[4700]: E1007 11:37:13.661686 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c677ff-4aff-4044-b765-e0b3de056302" containerName="keystone-db-sync" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.661697 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c677ff-4aff-4044-b765-e0b3de056302" containerName="keystone-db-sync" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.661882 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c677ff-4aff-4044-b765-e0b3de056302" containerName="keystone-db-sync" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.672585 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.678549 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.678777 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.678915 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.690512 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.694145 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-djssz" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.696799 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.729425 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qt97c"] Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.761874 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775613 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775651 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7n4c\" 
(UniqueName: \"kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775675 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775709 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hk5\" (UniqueName: \"kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775766 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775785 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775802 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775849 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775876 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775895 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775916 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.775938 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.848833 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.861891 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881066 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs99x\" (UniqueName: \"kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x\") pod \"6bce568a-3e8c-4994-ab06-02cec9b80cc3\" (UID: \"6bce568a-3e8c-4994-ab06-02cec9b80cc3\") " Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881156 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74r4r\" (UniqueName: \"kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r\") pod \"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9\" (UID: \"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9\") " Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881434 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881492 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881515 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7n4c\" (UniqueName: \"kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881536 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hk5\" (UniqueName: \"kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881588 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881607 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: 
\"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881625 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881674 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881702 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881720 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.881744 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 
11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.882647 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.890082 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.890849 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.892206 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x" (OuterVolumeSpecName: "kube-api-access-cs99x") pod "6bce568a-3e8c-4994-ab06-02cec9b80cc3" (UID: "6bce568a-3e8c-4994-ab06-02cec9b80cc3"). InnerVolumeSpecName "kube-api-access-cs99x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.892886 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.895297 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.900536 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r" (OuterVolumeSpecName: "kube-api-access-74r4r") pod "1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" (UID: "1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9"). InnerVolumeSpecName "kube-api-access-74r4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.910024 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.911864 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.916826 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.924658 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.936876 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.940371 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t7n4c\" (UniqueName: \"kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c\") pod \"dnsmasq-dns-5c5cc7c5ff-58wg8\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.946840 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hk5\" (UniqueName: \"kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5\") pod \"keystone-bootstrap-qt97c\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.988262 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs99x\" (UniqueName: \"kubernetes.io/projected/6bce568a-3e8c-4994-ab06-02cec9b80cc3-kube-api-access-cs99x\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:13 crc kubenswrapper[4700]: I1007 11:37:13.988288 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74r4r\" (UniqueName: \"kubernetes.io/projected/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9-kube-api-access-74r4r\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.022097 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:37:14 crc kubenswrapper[4700]: E1007 11:37:14.022504 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.022520 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: E1007 11:37:14.022542 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bce568a-3e8c-4994-ab06-02cec9b80cc3" 
containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.022550 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bce568a-3e8c-4994-ab06-02cec9b80cc3" containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.022721 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.022744 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bce568a-3e8c-4994-ab06-02cec9b80cc3" containerName="mariadb-account-create" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.024749 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.033681 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.033914 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.047700 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.131661 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.141228 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.142436 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195790 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195826 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195877 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195900 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195958 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8jd\" (UniqueName: \"kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.195992 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.196044 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.216948 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ppsbl"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.217942 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.235823 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7lc5j" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.243117 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.243186 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.247254 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.251713 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.256423 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ppsbl"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301230 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301319 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301347 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301370 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301394 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301414 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301454 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301482 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301503 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qj4w\" (UniqueName: \"kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301527 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8jd\" (UniqueName: \"kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 
11:37:14.301552 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.301572 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.303285 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.303713 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.307888 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.309709 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.312280 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.315017 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.315934 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.340761 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8jd\" (UniqueName: \"kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd\") pod \"ceilometer-0\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.375037 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.383900 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4750-account-create-h6rmf" event={"ID":"6bce568a-3e8c-4994-ab06-02cec9b80cc3","Type":"ContainerDied","Data":"0fed068b28f3c64ed1a542aa2b2597a002a3752d16fa81c696daff085d73dd56"} Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.383935 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fed068b28f3c64ed1a542aa2b2597a002a3752d16fa81c696daff085d73dd56" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.383945 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4750-account-create-h6rmf" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.385961 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerName="dnsmasq-dns" containerID="cri-o://b870a7fe10113bfb08c56ef47e45370554b1fa11d77722a5cd2eea0a8932f4ae" gracePeriod=10 Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.386256 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-897f-account-create-qqqnz" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.386284 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-897f-account-create-qqqnz" event={"ID":"1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9","Type":"ContainerDied","Data":"b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6"} Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.386322 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9bb0c80149b82c3a278c09fdde55e7cf9ddbe536662b89059912cf48f087dc6" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404130 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404200 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404224 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404254 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404281 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404297 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404349 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404379 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx6kv\" (UniqueName: \"kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404398 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config\") pod 
\"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404430 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qj4w\" (UniqueName: \"kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.404449 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.405811 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.412336 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.414755 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " 
pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.420620 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.440601 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qj4w\" (UniqueName: \"kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w\") pod \"placement-db-sync-ppsbl\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506139 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506464 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506525 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506638 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx6kv\" (UniqueName: \"kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506665 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.506723 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.508559 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.510061 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.510392 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.511128 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.517598 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.533060 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx6kv\" (UniqueName: \"kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv\") pod \"dnsmasq-dns-8b5c85b87-jkbj4\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.716004 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.741625 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.774562 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.775824 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.779684 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.780137 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9x8lb" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.780627 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.780800 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.805254 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.862833 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:14 crc kubenswrapper[4700]: W1007 11:37:14.876276 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74f43b30_1015_447e_bcfe_773238ff545f.slice/crio-72b66b9f7612a5e31041c5ee3d776813a5e82f0adb7573b3d682bd8a8806301b WatchSource:0}: Error finding container 72b66b9f7612a5e31041c5ee3d776813a5e82f0adb7573b3d682bd8a8806301b: Status 404 returned error can't find the container with id 72b66b9f7612a5e31041c5ee3d776813a5e82f0adb7573b3d682bd8a8806301b Oct 07 11:37:14 crc 
kubenswrapper[4700]: I1007 11:37:14.912929 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qt97c"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924391 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924455 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924518 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwl94\" (UniqueName: \"kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924537 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924567 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924585 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924607 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.924626 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.977816 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.986953 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.992447 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 11:37:14 crc kubenswrapper[4700]: I1007 11:37:14.993104 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.037877 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.037945 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.037986 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.038017 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 
11:37:15.038079 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.038134 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.038204 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwl94\" (UniqueName: \"kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.038222 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.040769 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.041113 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.043667 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.054783 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.055103 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.055426 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.057539 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 
11:37:15.071686 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.077649 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwl94\" (UniqueName: \"kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.077700 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.112784 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.139865 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltfn\" (UniqueName: \"kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.139926 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.139981 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.140011 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.140043 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.140086 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.140120 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.140152 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.241704 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242175 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242339 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xltfn\" (UniqueName: \"kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242413 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242515 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242592 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242665 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.242762 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.243464 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.244107 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.244337 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.249429 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.250506 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.252740 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.253751 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.263548 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xltfn\" (UniqueName: \"kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.270267 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.331179 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.393613 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qt97c" event={"ID":"34edfc11-9aef-4d9f-9887-d4b698771823","Type":"ContainerStarted","Data":"35a210cf86279e54c280485119415075acd36c72025728674b746957f04d00e7"} Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.395025 4700 generic.go:334] "Generic (PLEG): container finished" podID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerID="b870a7fe10113bfb08c56ef47e45370554b1fa11d77722a5cd2eea0a8932f4ae" exitCode=0 Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.395064 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" event={"ID":"3a8b4bab-bbbf-44e8-8205-3eb77e531abf","Type":"ContainerDied","Data":"b870a7fe10113bfb08c56ef47e45370554b1fa11d77722a5cd2eea0a8932f4ae"} Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.395836 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerStarted","Data":"35e0eb63f58e38d3c93eb63ed4c6ccc9c56b01729c0bdd379305b5abe24008c9"} Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.396830 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" event={"ID":"74f43b30-1015-447e-bcfe-773238ff545f","Type":"ContainerStarted","Data":"72b66b9f7612a5e31041c5ee3d776813a5e82f0adb7573b3d682bd8a8806301b"} Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.404421 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:15 crc kubenswrapper[4700]: W1007 11:37:15.420461 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57d2908e_ab77_494f_91e7_dcdabee84614.slice/crio-83e44672eea7cd7130dd1e5e7895bc0a70fd9d4b89898f2d5f8124045e3ce716 WatchSource:0}: Error finding container 83e44672eea7cd7130dd1e5e7895bc0a70fd9d4b89898f2d5f8124045e3ce716: Status 404 returned error can't find the container with id 83e44672eea7cd7130dd1e5e7895bc0a70fd9d4b89898f2d5f8124045e3ce716 Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.420844 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.452799 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ppsbl"] Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.493569 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.551485 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.551600 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.551657 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74794\" (UniqueName: \"kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: 
\"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.551678 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.551806 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.552017 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config\") pod \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\" (UID: \"3a8b4bab-bbbf-44e8-8205-3eb77e531abf\") " Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.559241 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794" (OuterVolumeSpecName: "kube-api-access-74794") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "kube-api-access-74794". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.624356 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.625331 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config" (OuterVolumeSpecName: "config") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.628064 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.629046 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.644489 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a8b4bab-bbbf-44e8-8205-3eb77e531abf" (UID: "3a8b4bab-bbbf-44e8-8205-3eb77e531abf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654182 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654208 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74794\" (UniqueName: \"kubernetes.io/projected/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-kube-api-access-74794\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654219 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654228 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654236 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.654245 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a8b4bab-bbbf-44e8-8205-3eb77e531abf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.921067 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bqgg5"] Oct 07 11:37:15 crc kubenswrapper[4700]: E1007 11:37:15.921716 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" 
containerName="dnsmasq-dns" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.921732 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerName="dnsmasq-dns" Oct 07 11:37:15 crc kubenswrapper[4700]: E1007 11:37:15.921762 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerName="init" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.921771 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerName="init" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.922003 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" containerName="dnsmasq-dns" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.923656 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.925904 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jtrf6" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.926614 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.926739 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.940031 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bqgg5"] Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.949814 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nkx5c"] Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.950920 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.953584 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pzcm7" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.954079 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 11:37:15 crc kubenswrapper[4700]: I1007 11:37:15.992790 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nkx5c"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.059183 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064638 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064681 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064706 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064772 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064824 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqj8\" (UniqueName: \"kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064846 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.064875 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtjtd\" (UniqueName: \"kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.065085 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.065107 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.098701 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.124956 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.166895 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.166968 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqj8\" (UniqueName: \"kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.166988 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167011 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtjtd\" (UniqueName: 
\"kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167032 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167056 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167088 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167157 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167179 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id\") pod \"cinder-db-sync-bqgg5\" (UID: 
\"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.167332 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.168891 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.173856 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.180200 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.183494 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.185828 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data\") pod \"barbican-db-sync-nkx5c\" (UID: 
\"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.191034 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.209696 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.210126 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqj8\" (UniqueName: \"kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8\") pod \"cinder-db-sync-bqgg5\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.220966 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtjtd\" (UniqueName: \"kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd\") pod \"barbican-db-sync-nkx5c\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.263760 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.275636 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.281038 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:16 crc kubenswrapper[4700]: W1007 11:37:16.408917 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6866f21_b1de_41ab_bac9_08c6acc7a5cc.slice/crio-724cfa14b9303b6cf5768478df20a621338eb3f37a83e2b0215a6e434f5dee38 WatchSource:0}: Error finding container 724cfa14b9303b6cf5768478df20a621338eb3f37a83e2b0215a6e434f5dee38: Status 404 returned error can't find the container with id 724cfa14b9303b6cf5768478df20a621338eb3f37a83e2b0215a6e434f5dee38 Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.432001 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ppsbl" event={"ID":"c7aaa144-2b28-48f6-8398-d4b0766f53f4","Type":"ContainerStarted","Data":"7c05091c2213054a504d9d62120ccc9b9f2ea9f1e2f563af4298a89c2cc25d50"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.488839 4700 generic.go:334] "Generic (PLEG): container finished" podID="57d2908e-ab77-494f-91e7-dcdabee84614" containerID="08f28cacd2f8e7af07963b9aa11dd780a92ddb49d61f1f09366669f1c8790b29" exitCode=0 Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.492464 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" event={"ID":"57d2908e-ab77-494f-91e7-dcdabee84614","Type":"ContainerDied","Data":"08f28cacd2f8e7af07963b9aa11dd780a92ddb49d61f1f09366669f1c8790b29"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.495489 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" event={"ID":"57d2908e-ab77-494f-91e7-dcdabee84614","Type":"ContainerStarted","Data":"83e44672eea7cd7130dd1e5e7895bc0a70fd9d4b89898f2d5f8124045e3ce716"} Oct 07 11:37:16 crc 
kubenswrapper[4700]: I1007 11:37:16.502820 4700 generic.go:334] "Generic (PLEG): container finished" podID="74f43b30-1015-447e-bcfe-773238ff545f" containerID="de42e681d91f28161f969b8009ad95acb087a3a9cc051b70b96f4805df08dd05" exitCode=0 Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.502922 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" event={"ID":"74f43b30-1015-447e-bcfe-773238ff545f","Type":"ContainerDied","Data":"de42e681d91f28161f969b8009ad95acb087a3a9cc051b70b96f4805df08dd05"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.509646 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qt97c" event={"ID":"34edfc11-9aef-4d9f-9887-d4b698771823","Type":"ContainerStarted","Data":"18949fced552b3138e4f36a465881e5a00520a9f77fb3f38d51c5aae57604e1c"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.540498 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" event={"ID":"3a8b4bab-bbbf-44e8-8205-3eb77e531abf","Type":"ContainerDied","Data":"381dcaec1abfe5eb901a6b6cf1162ac8a81bd795b34f6652b6cf3d39b5c58b3d"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.540544 4700 scope.go:117] "RemoveContainer" containerID="b870a7fe10113bfb08c56ef47e45370554b1fa11d77722a5cd2eea0a8932f4ae" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.540647 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-8wtm6" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.543728 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerStarted","Data":"32f72b725429c2090c573508a0fba9e70c03bc39bd02532e3cd690cf5decc601"} Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.598574 4700 scope.go:117] "RemoveContainer" containerID="51f9defbe0f6041e20233e1325512ba5c757afe4d829ec63ee8a530025bce97d" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.609883 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qt97c" podStartSLOduration=3.609862629 podStartE2EDuration="3.609862629s" podCreationTimestamp="2025-10-07 11:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:16.5953868 +0000 UTC m=+1003.391785789" watchObservedRunningTime="2025-10-07 11:37:16.609862629 +0000 UTC m=+1003.406261618" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.626324 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.632848 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-8wtm6"] Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.918620 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.990595 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.990659 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.990693 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.990806 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.990929 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7n4c\" (UniqueName: \"kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.991029 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config\") pod \"74f43b30-1015-447e-bcfe-773238ff545f\" (UID: \"74f43b30-1015-447e-bcfe-773238ff545f\") " Oct 07 11:37:16 crc kubenswrapper[4700]: I1007 11:37:16.998472 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c" (OuterVolumeSpecName: "kube-api-access-t7n4c") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "kube-api-access-t7n4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.015603 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.029504 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.041601 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.043031 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config" (OuterVolumeSpecName: "config") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.066222 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bqgg5"] Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.087440 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nkx5c"] Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.094658 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.094685 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.094696 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7n4c\" (UniqueName: \"kubernetes.io/projected/74f43b30-1015-447e-bcfe-773238ff545f-kube-api-access-t7n4c\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.094707 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.094716 4700 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.103941 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74f43b30-1015-447e-bcfe-773238ff545f" (UID: "74f43b30-1015-447e-bcfe-773238ff545f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.196412 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f43b30-1015-447e-bcfe-773238ff545f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.557435 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bqgg5" event={"ID":"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b","Type":"ContainerStarted","Data":"c42280b85c4a6f7e681273f655f89bcbca0f0635ac011e1092394d6aa2478902"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.563627 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" event={"ID":"74f43b30-1015-447e-bcfe-773238ff545f","Type":"ContainerDied","Data":"72b66b9f7612a5e31041c5ee3d776813a5e82f0adb7573b3d682bd8a8806301b"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.563676 4700 scope.go:117] "RemoveContainer" containerID="de42e681d91f28161f969b8009ad95acb087a3a9cc051b70b96f4805df08dd05" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.563757 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-58wg8" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.569775 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerStarted","Data":"7f01812cabe4682f649dfaad561fb88073dcbe34a1b10ae10b3b67017a8e9cdd"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.569810 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerStarted","Data":"724cfa14b9303b6cf5768478df20a621338eb3f37a83e2b0215a6e434f5dee38"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.579457 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerStarted","Data":"8d00eec4587f4c0fa854acca76ec2dcd8e92c5e5839bf52a2cac24021dc98659"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.583098 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" event={"ID":"57d2908e-ab77-494f-91e7-dcdabee84614","Type":"ContainerStarted","Data":"f5095e70592475b3be954a50f424d62a14a9c7e593aed7f0f40b3c20ec5499f7"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.583219 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.586561 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nkx5c" event={"ID":"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e","Type":"ContainerStarted","Data":"b8b402674eb52d8f10672ac41540ca304baca3b9b8d358a3303540b554f921ee"} Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.602506 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" 
podStartSLOduration=3.602488108 podStartE2EDuration="3.602488108s" podCreationTimestamp="2025-10-07 11:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:17.60104632 +0000 UTC m=+1004.397445309" watchObservedRunningTime="2025-10-07 11:37:17.602488108 +0000 UTC m=+1004.398887097" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.631666 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.638494 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-58wg8"] Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.967143 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a8b4bab-bbbf-44e8-8205-3eb77e531abf" path="/var/lib/kubelet/pods/3a8b4bab-bbbf-44e8-8205-3eb77e531abf/volumes" Oct 07 11:37:17 crc kubenswrapper[4700]: I1007 11:37:17.968101 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f43b30-1015-447e-bcfe-773238ff545f" path="/var/lib/kubelet/pods/74f43b30-1015-447e-bcfe-773238ff545f/volumes" Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.626243 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerStarted","Data":"4e9b768553c2b9249b8d1f233103a31b3ca5fa6493e982c7980e5a413d9693da"} Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.626334 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-log" containerID="cri-o://8d00eec4587f4c0fa854acca76ec2dcd8e92c5e5839bf52a2cac24021dc98659" gracePeriod=30 Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.626433 4700 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-httpd" containerID="cri-o://4e9b768553c2b9249b8d1f233103a31b3ca5fa6493e982c7980e5a413d9693da" gracePeriod=30 Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.636739 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-log" containerID="cri-o://7f01812cabe4682f649dfaad561fb88073dcbe34a1b10ae10b3b67017a8e9cdd" gracePeriod=30 Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.636838 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerStarted","Data":"966c29b04f2ac9425d2d30a0c3a67ae4c309122f0e176acd3677672980b4ef2f"} Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.636904 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-httpd" containerID="cri-o://966c29b04f2ac9425d2d30a0c3a67ae4c309122f0e176acd3677672980b4ef2f" gracePeriod=30 Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.659911 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.659890862 podStartE2EDuration="5.659890862s" podCreationTimestamp="2025-10-07 11:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:18.65598414 +0000 UTC m=+1005.452383139" watchObservedRunningTime="2025-10-07 11:37:18.659890862 +0000 UTC m=+1005.456289851" Oct 07 11:37:18 crc kubenswrapper[4700]: I1007 11:37:18.700213 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=5.700185927 podStartE2EDuration="5.700185927s" podCreationTimestamp="2025-10-07 11:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:18.698334608 +0000 UTC m=+1005.494733607" watchObservedRunningTime="2025-10-07 11:37:18.700185927 +0000 UTC m=+1005.496584946" Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.661788 4700 generic.go:334] "Generic (PLEG): container finished" podID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerID="966c29b04f2ac9425d2d30a0c3a67ae4c309122f0e176acd3677672980b4ef2f" exitCode=0 Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.662016 4700 generic.go:334] "Generic (PLEG): container finished" podID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerID="7f01812cabe4682f649dfaad561fb88073dcbe34a1b10ae10b3b67017a8e9cdd" exitCode=143 Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.661898 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerDied","Data":"966c29b04f2ac9425d2d30a0c3a67ae4c309122f0e176acd3677672980b4ef2f"} Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.662092 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerDied","Data":"7f01812cabe4682f649dfaad561fb88073dcbe34a1b10ae10b3b67017a8e9cdd"} Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.665853 4700 generic.go:334] "Generic (PLEG): container finished" podID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerID="4e9b768553c2b9249b8d1f233103a31b3ca5fa6493e982c7980e5a413d9693da" exitCode=0 Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.665878 4700 generic.go:334] "Generic (PLEG): container finished" podID="7854e722-9de1-4249-943e-b5c4ce4634ad" 
containerID="8d00eec4587f4c0fa854acca76ec2dcd8e92c5e5839bf52a2cac24021dc98659" exitCode=143 Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.665898 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerDied","Data":"4e9b768553c2b9249b8d1f233103a31b3ca5fa6493e982c7980e5a413d9693da"} Oct 07 11:37:19 crc kubenswrapper[4700]: I1007 11:37:19.665921 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerDied","Data":"8d00eec4587f4c0fa854acca76ec2dcd8e92c5e5839bf52a2cac24021dc98659"} Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.683027 4700 generic.go:334] "Generic (PLEG): container finished" podID="34edfc11-9aef-4d9f-9887-d4b698771823" containerID="18949fced552b3138e4f36a465881e5a00520a9f77fb3f38d51c5aae57604e1c" exitCode=0 Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.683071 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qt97c" event={"ID":"34edfc11-9aef-4d9f-9887-d4b698771823","Type":"ContainerDied","Data":"18949fced552b3138e4f36a465881e5a00520a9f77fb3f38d51c5aae57604e1c"} Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.798892 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-f1ef-account-create-jrd5q"] Oct 07 11:37:20 crc kubenswrapper[4700]: E1007 11:37:20.799364 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f43b30-1015-447e-bcfe-773238ff545f" containerName="init" Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.799384 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f43b30-1015-447e-bcfe-773238ff545f" containerName="init" Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.799595 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f43b30-1015-447e-bcfe-773238ff545f" 
containerName="init" Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.800270 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.803466 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.811425 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f1ef-account-create-jrd5q"] Oct 07 11:37:20 crc kubenswrapper[4700]: I1007 11:37:20.899491 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lwjc\" (UniqueName: \"kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc\") pod \"heat-f1ef-account-create-jrd5q\" (UID: \"980702b0-2482-4109-8ece-d5ecef7f84fc\") " pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.001050 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lwjc\" (UniqueName: \"kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc\") pod \"heat-f1ef-account-create-jrd5q\" (UID: \"980702b0-2482-4109-8ece-d5ecef7f84fc\") " pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.005996 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b0d7-account-create-kdfv5"] Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.007391 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.009678 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.012210 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b0d7-account-create-kdfv5"] Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.022385 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lwjc\" (UniqueName: \"kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc\") pod \"heat-f1ef-account-create-jrd5q\" (UID: \"980702b0-2482-4109-8ece-d5ecef7f84fc\") " pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.103053 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhsr\" (UniqueName: \"kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr\") pod \"neutron-b0d7-account-create-kdfv5\" (UID: \"f4cbf147-32c1-4ba8-b7b0-f91d1df46101\") " pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.129225 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.204880 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhsr\" (UniqueName: \"kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr\") pod \"neutron-b0d7-account-create-kdfv5\" (UID: \"f4cbf147-32c1-4ba8-b7b0-f91d1df46101\") " pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.230713 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhsr\" (UniqueName: \"kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr\") pod \"neutron-b0d7-account-create-kdfv5\" (UID: \"f4cbf147-32c1-4ba8-b7b0-f91d1df46101\") " pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:21 crc kubenswrapper[4700]: I1007 11:37:21.409424 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.581937 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.590807 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649478 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649549 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649617 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649643 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649678 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649702 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649734 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649764 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649789 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwl94\" (UniqueName: \"kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649813 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649841 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xltfn\" (UniqueName: \"kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649878 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649916 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649937 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts\") pod \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\" (UID: \"e6866f21-b1de-41ab-bac9-08c6acc7a5cc\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649963 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.649988 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts\") pod \"7854e722-9de1-4249-943e-b5c4ce4634ad\" (UID: \"7854e722-9de1-4249-943e-b5c4ce4634ad\") " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.650207 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.650607 4700 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.651483 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.651657 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs" (OuterVolumeSpecName: "logs") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.655280 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs" (OuterVolumeSpecName: "logs") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.656002 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.657027 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn" (OuterVolumeSpecName: "kube-api-access-xltfn") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "kube-api-access-xltfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.657994 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts" (OuterVolumeSpecName: "scripts") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.659026 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.661295 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94" (OuterVolumeSpecName: "kube-api-access-kwl94") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "kube-api-access-kwl94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.662082 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts" (OuterVolumeSpecName: "scripts") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.702509 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.703986 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data" (OuterVolumeSpecName: "config-data") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.711358 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.713602 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.725227 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6866f21-b1de-41ab-bac9-08c6acc7a5cc","Type":"ContainerDied","Data":"724cfa14b9303b6cf5768478df20a621338eb3f37a83e2b0215a6e434f5dee38"} Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.725370 4700 scope.go:117] "RemoveContainer" containerID="966c29b04f2ac9425d2d30a0c3a67ae4c309122f0e176acd3677672980b4ef2f" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.725575 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.728954 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7854e722-9de1-4249-943e-b5c4ce4634ad","Type":"ContainerDied","Data":"32f72b725429c2090c573508a0fba9e70c03bc39bd02532e3cd690cf5decc601"} Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.729038 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.732765 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data" (OuterVolumeSpecName: "config-data") pod "e6866f21-b1de-41ab-bac9-08c6acc7a5cc" (UID: "e6866f21-b1de-41ab-bac9-08c6acc7a5cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.739457 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7854e722-9de1-4249-943e-b5c4ce4634ad" (UID: "7854e722-9de1-4249-943e-b5c4ce4634ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.753080 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7854e722-9de1-4249-943e-b5c4ce4634ad-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.753347 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.753372 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766574 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc 
kubenswrapper[4700]: I1007 11:37:23.766612 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwl94\" (UniqueName: \"kubernetes.io/projected/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-kube-api-access-kwl94\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766626 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766641 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xltfn\" (UniqueName: \"kubernetes.io/projected/7854e722-9de1-4249-943e-b5c4ce4634ad-kube-api-access-xltfn\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766652 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766665 4700 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766675 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766711 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766733 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766744 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766755 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6866f21-b1de-41ab-bac9-08c6acc7a5cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.766767 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7854e722-9de1-4249-943e-b5c4ce4634ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.774756 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.786996 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.868834 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:23 crc kubenswrapper[4700]: I1007 11:37:23.868881 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.072394 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.079588 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.091112 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.101970 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.107451 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: E1007 11:37:24.107895 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.107910 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: E1007 11:37:24.107930 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.107937 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: E1007 11:37:24.107956 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.107962 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: E1007 11:37:24.107973 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.107980 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.108130 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.108146 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.108156 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-httpd" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.108167 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" containerName="glance-log" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.135118 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.142316 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.142508 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.142948 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9x8lb" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.143584 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.153966 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.155795 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.161828 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.162325 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.184377 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186141 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186207 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186231 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186265 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186285 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbncm\" (UniqueName: \"kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186330 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186421 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.186445 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.203918 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 
11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288148 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288210 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288238 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288267 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbncm\" (UniqueName: \"kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288293 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc 
kubenswrapper[4700]: I1007 11:37:24.288337 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288378 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpz78\" (UniqueName: \"kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288429 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288466 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288491 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288530 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288559 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288588 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288612 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288666 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.288701 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.290643 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.291378 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.291495 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.294230 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.294747 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.296953 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.311024 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbncm\" (UniqueName: \"kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.311531 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.323612 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390162 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390211 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390247 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpz78\" (UniqueName: \"kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390288 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390382 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390401 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390422 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390438 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.390804 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.391075 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.394611 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.394724 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.398781 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.400711 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.404848 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.414080 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpz78\" (UniqueName: \"kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: 
I1007 11:37:24.424081 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.466655 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.480549 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.742472 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.799051 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:37:24 crc kubenswrapper[4700]: I1007 11:37:24.799296 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" containerID="cri-o://4e08da9ca45d7f088c3286e015ccfbe4bdf01fec95faf7e7272a0a3a4f18d0c5" gracePeriod=10 Oct 07 11:37:25 crc kubenswrapper[4700]: I1007 11:37:25.757006 4700 generic.go:334] "Generic (PLEG): container finished" podID="944eef79-0417-4a6d-b965-807e00d64351" containerID="4e08da9ca45d7f088c3286e015ccfbe4bdf01fec95faf7e7272a0a3a4f18d0c5" exitCode=0 Oct 07 11:37:25 crc kubenswrapper[4700]: I1007 11:37:25.757071 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" event={"ID":"944eef79-0417-4a6d-b965-807e00d64351","Type":"ContainerDied","Data":"4e08da9ca45d7f088c3286e015ccfbe4bdf01fec95faf7e7272a0a3a4f18d0c5"} Oct 07 11:37:25 crc 
kubenswrapper[4700]: I1007 11:37:25.967109 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7854e722-9de1-4249-943e-b5c4ce4634ad" path="/var/lib/kubelet/pods/7854e722-9de1-4249-943e-b5c4ce4634ad/volumes" Oct 07 11:37:25 crc kubenswrapper[4700]: I1007 11:37:25.967900 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6866f21-b1de-41ab-bac9-08c6acc7a5cc" path="/var/lib/kubelet/pods/e6866f21-b1de-41ab-bac9-08c6acc7a5cc/volumes" Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.911506 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962390 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys\") pod \"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962450 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts\") pod \"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962592 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys\") pod \"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962645 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49hk5\" (UniqueName: \"kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5\") pod 
\"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962787 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle\") pod \"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.962938 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data\") pod \"34edfc11-9aef-4d9f-9887-d4b698771823\" (UID: \"34edfc11-9aef-4d9f-9887-d4b698771823\") " Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.976651 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.976770 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.976818 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5" (OuterVolumeSpecName: "kube-api-access-49hk5") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "kube-api-access-49hk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:27 crc kubenswrapper[4700]: I1007 11:37:27.976883 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts" (OuterVolumeSpecName: "scripts") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.000382 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.003753 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data" (OuterVolumeSpecName: "config-data") pod "34edfc11-9aef-4d9f-9887-d4b698771823" (UID: "34edfc11-9aef-4d9f-9887-d4b698771823"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064567 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064608 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064620 4700 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064631 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064642 4700 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/34edfc11-9aef-4d9f-9887-d4b698771823-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.064654 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49hk5\" (UniqueName: \"kubernetes.io/projected/34edfc11-9aef-4d9f-9887-d4b698771823-kube-api-access-49hk5\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.790204 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qt97c" event={"ID":"34edfc11-9aef-4d9f-9887-d4b698771823","Type":"ContainerDied","Data":"35a210cf86279e54c280485119415075acd36c72025728674b746957f04d00e7"} Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 
11:37:28.790242 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qt97c" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.790261 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a210cf86279e54c280485119415075acd36c72025728674b746957f04d00e7" Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.992433 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qt97c"] Oct 07 11:37:28 crc kubenswrapper[4700]: I1007 11:37:28.998419 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qt97c"] Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.098956 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6ftwf"] Oct 07 11:37:29 crc kubenswrapper[4700]: E1007 11:37:29.099508 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34edfc11-9aef-4d9f-9887-d4b698771823" containerName="keystone-bootstrap" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.099532 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="34edfc11-9aef-4d9f-9887-d4b698771823" containerName="keystone-bootstrap" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.099872 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="34edfc11-9aef-4d9f-9887-d4b698771823" containerName="keystone-bootstrap" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.100775 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.104175 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.104755 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.105233 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.105541 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-djssz" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.118539 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ftwf"] Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.188674 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.188715 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.188749 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts\") pod \"keystone-bootstrap-6ftwf\" (UID: 
\"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.188774 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.188792 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcmbv\" (UniqueName: \"kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.189322 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291395 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291473 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: 
\"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291522 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291572 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291618 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.291656 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcmbv\" (UniqueName: \"kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.296901 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc 
kubenswrapper[4700]: I1007 11:37:29.297507 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.297854 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.298222 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.308253 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.310905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcmbv\" (UniqueName: \"kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv\") pod \"keystone-bootstrap-6ftwf\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.423764 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:29 crc kubenswrapper[4700]: I1007 11:37:29.967174 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34edfc11-9aef-4d9f-9887-d4b698771823" path="/var/lib/kubelet/pods/34edfc11-9aef-4d9f-9887-d4b698771823/volumes" Oct 07 11:37:32 crc kubenswrapper[4700]: I1007 11:37:32.636379 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Oct 07 11:37:34 crc kubenswrapper[4700]: I1007 11:37:34.926138 4700 scope.go:117] "RemoveContainer" containerID="7f01812cabe4682f649dfaad561fb88073dcbe34a1b10ae10b3b67017a8e9cdd" Oct 07 11:37:34 crc kubenswrapper[4700]: I1007 11:37:34.996524 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.093754 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb4z4\" (UniqueName: \"kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4\") pod \"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.093857 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb\") pod \"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.093929 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc\") pod 
\"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.094022 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb\") pod \"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.094137 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config\") pod \"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.094343 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0\") pod \"944eef79-0417-4a6d-b965-807e00d64351\" (UID: \"944eef79-0417-4a6d-b965-807e00d64351\") " Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.100863 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4" (OuterVolumeSpecName: "kube-api-access-vb4z4") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "kube-api-access-vb4z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.154281 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config" (OuterVolumeSpecName: "config") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.157531 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.159397 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.172943 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.181877 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "944eef79-0417-4a6d-b965-807e00d64351" (UID: "944eef79-0417-4a6d-b965-807e00d64351"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197890 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197927 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb4z4\" (UniqueName: \"kubernetes.io/projected/944eef79-0417-4a6d-b965-807e00d64351-kube-api-access-vb4z4\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197941 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197954 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197967 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.197978 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944eef79-0417-4a6d-b965-807e00d64351-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.865381 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.865383 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" event={"ID":"944eef79-0417-4a6d-b965-807e00d64351","Type":"ContainerDied","Data":"22e346303c4630bdcceea97bd18b48d5b84c50bc45f81fbe5dd3287cced08707"} Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.905007 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.910657 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dmn6s"] Oct 07 11:37:35 crc kubenswrapper[4700]: I1007 11:37:35.966714 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944eef79-0417-4a6d-b965-807e00d64351" path="/var/lib/kubelet/pods/944eef79-0417-4a6d-b965-807e00d64351/volumes" Oct 07 11:37:36 crc kubenswrapper[4700]: E1007 11:37:36.148286 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 11:37:36 crc kubenswrapper[4700]: E1007 11:37:36.148487 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twqj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bqgg5_openstack(bdc86b02-ca87-4bda-869a-6fd42e5f5f1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 11:37:36 crc kubenswrapper[4700]: E1007 11:37:36.149803 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bqgg5" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.158270 4700 scope.go:117] "RemoveContainer" containerID="4e9b768553c2b9249b8d1f233103a31b3ca5fa6493e982c7980e5a413d9693da" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.288087 4700 scope.go:117] "RemoveContainer" containerID="8d00eec4587f4c0fa854acca76ec2dcd8e92c5e5839bf52a2cac24021dc98659" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.330632 4700 scope.go:117] "RemoveContainer" containerID="4e08da9ca45d7f088c3286e015ccfbe4bdf01fec95faf7e7272a0a3a4f18d0c5" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.350884 4700 scope.go:117] "RemoveContainer" containerID="eca12d80c3de6e4768a4a0dfaa7804d2098fff2cb9593407cbc853fc0c08904c" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.685258 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.693271 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b0d7-account-create-kdfv5"] Oct 07 11:37:36 crc kubenswrapper[4700]: W1007 11:37:36.693439 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f62089a_2daa_491e_bcdb_7f793df7cd99.slice/crio-a1623f978fcf4d67b261a6e3c9463e9d73f57e01b13bcb9dfb6af5bd60b69f55 WatchSource:0}: Error finding container a1623f978fcf4d67b261a6e3c9463e9d73f57e01b13bcb9dfb6af5bd60b69f55: Status 404 returned error can't find the container with id a1623f978fcf4d67b261a6e3c9463e9d73f57e01b13bcb9dfb6af5bd60b69f55 Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.717833 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f1ef-account-create-jrd5q"] Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.763987 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.819272 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.843331 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ftwf"] Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.895347 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ppsbl" event={"ID":"c7aaa144-2b28-48f6-8398-d4b0766f53f4","Type":"ContainerStarted","Data":"9ecaf3e0157a3c65e904791cca5a5879c87804fb536def8eb1e20b4919783cc9"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.897653 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nkx5c" event={"ID":"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e","Type":"ContainerStarted","Data":"6745a98b7adf70ff20c6d8fe1332644b0c1436c7dec4ad9e2e9653dcb7a9a283"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.915194 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ppsbl" podStartSLOduration=3.549178064 podStartE2EDuration="22.915174701s" podCreationTimestamp="2025-10-07 11:37:14 +0000 UTC" 
firstStartedPulling="2025-10-07 11:37:15.507571391 +0000 UTC m=+1002.303970380" lastFinishedPulling="2025-10-07 11:37:34.873568018 +0000 UTC m=+1021.669967017" observedRunningTime="2025-10-07 11:37:36.912642995 +0000 UTC m=+1023.709041984" watchObservedRunningTime="2025-10-07 11:37:36.915174701 +0000 UTC m=+1023.711573690" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.915742 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1ef-account-create-jrd5q" event={"ID":"980702b0-2482-4109-8ece-d5ecef7f84fc","Type":"ContainerStarted","Data":"53c3b64c0d982a09ff805ebc1108270e771b7a0e4a64f1f1cb7426eb8abcfac6"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.922191 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ftwf" event={"ID":"e383a2d1-cd6e-4658-9148-ef9c25c63430","Type":"ContainerStarted","Data":"7dfeca2134ff739dd9fa1ef8ce79a9caa0b3f007d9e91ed37f6b943bad7fb56e"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.935073 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerStarted","Data":"1f0493224277713d33d3c5331e8bb336b71e555a8e3b1707531c13cd3a7e06ba"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.937776 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerStarted","Data":"a1623f978fcf4d67b261a6e3c9463e9d73f57e01b13bcb9dfb6af5bd60b69f55"} Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.948596 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nkx5c" podStartSLOduration=2.931358606 podStartE2EDuration="21.948569565s" podCreationTimestamp="2025-10-07 11:37:15 +0000 UTC" firstStartedPulling="2025-10-07 11:37:17.110683127 +0000 UTC m=+1003.907082116" lastFinishedPulling="2025-10-07 11:37:36.127894086 
+0000 UTC m=+1022.924293075" observedRunningTime="2025-10-07 11:37:36.932960407 +0000 UTC m=+1023.729359396" watchObservedRunningTime="2025-10-07 11:37:36.948569565 +0000 UTC m=+1023.744968554" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.952160 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b0d7-account-create-kdfv5" event={"ID":"f4cbf147-32c1-4ba8-b7b0-f91d1df46101","Type":"ContainerStarted","Data":"42e5f04fd015d239128e0de38ad025f1235c677075810b493e6f9ab412bb045c"} Oct 07 11:37:36 crc kubenswrapper[4700]: E1007 11:37:36.956414 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-bqgg5" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" Oct 07 11:37:36 crc kubenswrapper[4700]: I1007 11:37:36.982277 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:37:37 crc kubenswrapper[4700]: W1007 11:37:37.011913 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba6fb68_aeed_4282_ab35_a7cc04b39f52.slice/crio-b8701cccb134785eebf4bd59ef51de73fb7d5891b3a7d4738de6869724c3e920 WatchSource:0}: Error finding container b8701cccb134785eebf4bd59ef51de73fb7d5891b3a7d4738de6869724c3e920: Status 404 returned error can't find the container with id b8701cccb134785eebf4bd59ef51de73fb7d5891b3a7d4738de6869724c3e920 Oct 07 11:37:37 crc kubenswrapper[4700]: I1007 11:37:37.637475 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-dmn6s" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Oct 07 11:37:37 crc kubenswrapper[4700]: I1007 11:37:37.981470 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerStarted","Data":"baa5af43a455336c1da8d2c15456158bfd440fe9e926e442d93be4e6efa18560"} Oct 07 11:37:37 crc kubenswrapper[4700]: I1007 11:37:37.985248 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerStarted","Data":"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040"} Oct 07 11:37:37 crc kubenswrapper[4700]: I1007 11:37:37.985291 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerStarted","Data":"b8701cccb134785eebf4bd59ef51de73fb7d5891b3a7d4738de6869724c3e920"} Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.002563 4700 generic.go:334] "Generic (PLEG): container finished" podID="980702b0-2482-4109-8ece-d5ecef7f84fc" containerID="e7570c0d7e2d342d603bedebed72f3dbb2dc11e306ee4c4366531fd76e44ed6a" exitCode=0 Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.002677 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1ef-account-create-jrd5q" event={"ID":"980702b0-2482-4109-8ece-d5ecef7f84fc","Type":"ContainerDied","Data":"e7570c0d7e2d342d603bedebed72f3dbb2dc11e306ee4c4366531fd76e44ed6a"} Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.007499 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerStarted","Data":"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b"} Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.008832 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ftwf" 
event={"ID":"e383a2d1-cd6e-4658-9148-ef9c25c63430","Type":"ContainerStarted","Data":"9819328bfdc94d3425997889115d50d17e11b13b7ab4f63ae8a7b884ef3f8890"} Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.011249 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4cbf147-32c1-4ba8-b7b0-f91d1df46101" containerID="a2f607e9371a10c56ee4950077f53fad0da7e1ebb8672a4013168a25ebad525e" exitCode=0 Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.011413 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b0d7-account-create-kdfv5" event={"ID":"f4cbf147-32c1-4ba8-b7b0-f91d1df46101","Type":"ContainerDied","Data":"a2f607e9371a10c56ee4950077f53fad0da7e1ebb8672a4013168a25ebad525e"} Oct 07 11:37:38 crc kubenswrapper[4700]: I1007 11:37:38.056765 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6ftwf" podStartSLOduration=9.056744527 podStartE2EDuration="9.056744527s" podCreationTimestamp="2025-10-07 11:37:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:38.052630419 +0000 UTC m=+1024.849029408" watchObservedRunningTime="2025-10-07 11:37:38.056744527 +0000 UTC m=+1024.853143516" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.024068 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerStarted","Data":"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646"} Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.031868 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerStarted","Data":"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6"} Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.058317 4700 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.058286839 podStartE2EDuration="15.058286839s" podCreationTimestamp="2025-10-07 11:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:39.052029595 +0000 UTC m=+1025.848428584" watchObservedRunningTime="2025-10-07 11:37:39.058286839 +0000 UTC m=+1025.854685828" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.078234 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.078219611 podStartE2EDuration="15.078219611s" podCreationTimestamp="2025-10-07 11:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:39.07703549 +0000 UTC m=+1025.873434479" watchObservedRunningTime="2025-10-07 11:37:39.078219611 +0000 UTC m=+1025.874618590" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.341709 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.461962 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.474884 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lwjc\" (UniqueName: \"kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc\") pod \"980702b0-2482-4109-8ece-d5ecef7f84fc\" (UID: \"980702b0-2482-4109-8ece-d5ecef7f84fc\") " Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.490827 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc" (OuterVolumeSpecName: "kube-api-access-6lwjc") pod "980702b0-2482-4109-8ece-d5ecef7f84fc" (UID: "980702b0-2482-4109-8ece-d5ecef7f84fc"). InnerVolumeSpecName "kube-api-access-6lwjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.576272 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnhsr\" (UniqueName: \"kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr\") pod \"f4cbf147-32c1-4ba8-b7b0-f91d1df46101\" (UID: \"f4cbf147-32c1-4ba8-b7b0-f91d1df46101\") " Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.576695 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lwjc\" (UniqueName: \"kubernetes.io/projected/980702b0-2482-4109-8ece-d5ecef7f84fc-kube-api-access-6lwjc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.580768 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr" (OuterVolumeSpecName: "kube-api-access-cnhsr") pod "f4cbf147-32c1-4ba8-b7b0-f91d1df46101" (UID: "f4cbf147-32c1-4ba8-b7b0-f91d1df46101"). InnerVolumeSpecName "kube-api-access-cnhsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:39 crc kubenswrapper[4700]: I1007 11:37:39.678375 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnhsr\" (UniqueName: \"kubernetes.io/projected/f4cbf147-32c1-4ba8-b7b0-f91d1df46101-kube-api-access-cnhsr\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.046791 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1ef-account-create-jrd5q" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.046798 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1ef-account-create-jrd5q" event={"ID":"980702b0-2482-4109-8ece-d5ecef7f84fc","Type":"ContainerDied","Data":"53c3b64c0d982a09ff805ebc1108270e771b7a0e4a64f1f1cb7426eb8abcfac6"} Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.046926 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c3b64c0d982a09ff805ebc1108270e771b7a0e4a64f1f1cb7426eb8abcfac6" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.050293 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b0d7-account-create-kdfv5" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.051456 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b0d7-account-create-kdfv5" event={"ID":"f4cbf147-32c1-4ba8-b7b0-f91d1df46101","Type":"ContainerDied","Data":"42e5f04fd015d239128e0de38ad025f1235c677075810b493e6f9ab412bb045c"} Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.051495 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e5f04fd015d239128e0de38ad025f1235c677075810b493e6f9ab412bb045c" Oct 07 11:37:40 crc kubenswrapper[4700]: E1007 11:37:40.131547 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod980702b0_2482_4109_8ece_d5ecef7f84fc.slice/crio-53c3b64c0d982a09ff805ebc1108270e771b7a0e4a64f1f1cb7426eb8abcfac6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7aaa144_2b28_48f6_8398_d4b0766f53f4.slice/crio-conmon-9ecaf3e0157a3c65e904791cca5a5879c87804fb536def8eb1e20b4919783cc9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod980702b0_2482_4109_8ece_d5ecef7f84fc.slice\": RecentStats: unable to find data in memory cache]" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869099 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-lp48h"] Oct 07 11:37:40 crc kubenswrapper[4700]: E1007 11:37:40.869652 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cbf147-32c1-4ba8-b7b0-f91d1df46101" containerName="mariadb-account-create" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869671 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cbf147-32c1-4ba8-b7b0-f91d1df46101" containerName="mariadb-account-create" Oct 
07 11:37:40 crc kubenswrapper[4700]: E1007 11:37:40.869690 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869698 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" Oct 07 11:37:40 crc kubenswrapper[4700]: E1007 11:37:40.869711 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="init" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869719 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="init" Oct 07 11:37:40 crc kubenswrapper[4700]: E1007 11:37:40.869735 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980702b0-2482-4109-8ece-d5ecef7f84fc" containerName="mariadb-account-create" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869743 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="980702b0-2482-4109-8ece-d5ecef7f84fc" containerName="mariadb-account-create" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869958 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="944eef79-0417-4a6d-b965-807e00d64351" containerName="dnsmasq-dns" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869985 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cbf147-32c1-4ba8-b7b0-f91d1df46101" containerName="mariadb-account-create" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.869999 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="980702b0-2482-4109-8ece-d5ecef7f84fc" containerName="mariadb-account-create" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.870807 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.872747 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bxh9j" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.873227 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 07 11:37:40 crc kubenswrapper[4700]: I1007 11:37:40.882896 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lp48h"] Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.002571 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.002634 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74v9p\" (UniqueName: \"kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.002676 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.064293 4700 generic.go:334] "Generic (PLEG): container finished" podID="c7aaa144-2b28-48f6-8398-d4b0766f53f4" containerID="9ecaf3e0157a3c65e904791cca5a5879c87804fb536def8eb1e20b4919783cc9" exitCode=0 Oct 07 11:37:41 crc 
kubenswrapper[4700]: I1007 11:37:41.064389 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ppsbl" event={"ID":"c7aaa144-2b28-48f6-8398-d4b0766f53f4","Type":"ContainerDied","Data":"9ecaf3e0157a3c65e904791cca5a5879c87804fb536def8eb1e20b4919783cc9"} Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.070032 4700 generic.go:334] "Generic (PLEG): container finished" podID="4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" containerID="6745a98b7adf70ff20c6d8fe1332644b0c1436c7dec4ad9e2e9653dcb7a9a283" exitCode=0 Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.070105 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nkx5c" event={"ID":"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e","Type":"ContainerDied","Data":"6745a98b7adf70ff20c6d8fe1332644b0c1436c7dec4ad9e2e9653dcb7a9a283"} Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.071770 4700 generic.go:334] "Generic (PLEG): container finished" podID="e383a2d1-cd6e-4658-9148-ef9c25c63430" containerID="9819328bfdc94d3425997889115d50d17e11b13b7ab4f63ae8a7b884ef3f8890" exitCode=0 Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.071796 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ftwf" event={"ID":"e383a2d1-cd6e-4658-9148-ef9c25c63430","Type":"ContainerDied","Data":"9819328bfdc94d3425997889115d50d17e11b13b7ab4f63ae8a7b884ef3f8890"} Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.103937 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.104009 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74v9p\" (UniqueName: 
\"kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.104055 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.121433 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.121464 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.129454 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74v9p\" (UniqueName: \"kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p\") pod \"heat-db-sync-lp48h\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.169939 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z8zs9"] Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.171217 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.173059 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.173345 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.173537 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2nz5w" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.179783 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z8zs9"] Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.211435 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lp48h" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.306884 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.307103 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.307213 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xzm\" (UniqueName: \"kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm\") pod \"neutron-db-sync-z8zs9\" (UID: 
\"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.408298 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xzm\" (UniqueName: \"kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.408432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.408487 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.421683 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.425716 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.431142 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xzm\" (UniqueName: \"kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm\") pod \"neutron-db-sync-z8zs9\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:41 crc kubenswrapper[4700]: I1007 11:37:41.509551 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.073425 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.079143 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.086924 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.104851 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ftwf" event={"ID":"e383a2d1-cd6e-4658-9148-ef9c25c63430","Type":"ContainerDied","Data":"7dfeca2134ff739dd9fa1ef8ce79a9caa0b3f007d9e91ed37f6b943bad7fb56e"} Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.104890 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfeca2134ff739dd9fa1ef8ce79a9caa0b3f007d9e91ed37f6b943bad7fb56e" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.104959 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6ftwf" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.115615 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ppsbl" event={"ID":"c7aaa144-2b28-48f6-8398-d4b0766f53f4","Type":"ContainerDied","Data":"7c05091c2213054a504d9d62120ccc9b9f2ea9f1e2f563af4298a89c2cc25d50"} Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.115659 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c05091c2213054a504d9d62120ccc9b9f2ea9f1e2f563af4298a89c2cc25d50" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.115758 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ppsbl" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.124991 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nkx5c" event={"ID":"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e","Type":"ContainerDied","Data":"b8b402674eb52d8f10672ac41540ca304baca3b9b8d358a3303540b554f921ee"} Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.125029 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b402674eb52d8f10672ac41540ca304baca3b9b8d358a3303540b554f921ee" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.125094 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nkx5c" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248215 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data\") pod \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248534 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data\") pod \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248586 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248645 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts\") pod \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248666 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtjtd\" (UniqueName: \"kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd\") pod \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248703 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248738 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248781 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248830 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcmbv\" (UniqueName: \"kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248846 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle\") pod \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248869 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle\") pod \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\" (UID: \"4ccc0ab6-f412-449d-bef2-82bbd06f3d9e\") " Oct 07 11:37:43 crc kubenswrapper[4700]: 
I1007 11:37:43.248902 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle\") pod \"e383a2d1-cd6e-4658-9148-ef9c25c63430\" (UID: \"e383a2d1-cd6e-4658-9148-ef9c25c63430\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248922 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs\") pod \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.248945 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qj4w\" (UniqueName: \"kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w\") pod \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\" (UID: \"c7aaa144-2b28-48f6-8398-d4b0766f53f4\") " Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.252757 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs" (OuterVolumeSpecName: "logs") pod "c7aaa144-2b28-48f6-8398-d4b0766f53f4" (UID: "c7aaa144-2b28-48f6-8398-d4b0766f53f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.256831 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.256999 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd" (OuterVolumeSpecName: "kube-api-access-qtjtd") pod "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" (UID: "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e"). InnerVolumeSpecName "kube-api-access-qtjtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.257228 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.257263 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" (UID: "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.257527 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w" (OuterVolumeSpecName: "kube-api-access-9qj4w") pod "c7aaa144-2b28-48f6-8398-d4b0766f53f4" (UID: "c7aaa144-2b28-48f6-8398-d4b0766f53f4"). InnerVolumeSpecName "kube-api-access-9qj4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.269551 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv" (OuterVolumeSpecName: "kube-api-access-qcmbv") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "kube-api-access-qcmbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.269706 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts" (OuterVolumeSpecName: "scripts") pod "c7aaa144-2b28-48f6-8398-d4b0766f53f4" (UID: "c7aaa144-2b28-48f6-8398-d4b0766f53f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.270870 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts" (OuterVolumeSpecName: "scripts") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.286214 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" (UID: "4ccc0ab6-f412-449d-bef2-82bbd06f3d9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.299283 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data" (OuterVolumeSpecName: "config-data") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.308566 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e383a2d1-cd6e-4658-9148-ef9c25c63430" (UID: "e383a2d1-cd6e-4658-9148-ef9c25c63430"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.327238 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7aaa144-2b28-48f6-8398-d4b0766f53f4" (UID: "c7aaa144-2b28-48f6-8398-d4b0766f53f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.345820 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-55444c599f-s65df"] Oct 07 11:37:43 crc kubenswrapper[4700]: E1007 11:37:43.346134 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e383a2d1-cd6e-4658-9148-ef9c25c63430" containerName="keystone-bootstrap" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.346145 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e383a2d1-cd6e-4658-9148-ef9c25c63430" containerName="keystone-bootstrap" Oct 07 11:37:43 crc kubenswrapper[4700]: E1007 11:37:43.346160 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" containerName="barbican-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.346166 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" containerName="barbican-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: E1007 11:37:43.346182 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7aaa144-2b28-48f6-8398-d4b0766f53f4" containerName="placement-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.346188 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7aaa144-2b28-48f6-8398-d4b0766f53f4" containerName="placement-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.347167 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" containerName="barbican-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.347194 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e383a2d1-cd6e-4658-9148-ef9c25c63430" containerName="keystone-bootstrap" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.347206 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c7aaa144-2b28-48f6-8398-d4b0766f53f4" containerName="placement-db-sync" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.348165 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.350789 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data" (OuterVolumeSpecName: "config-data") pod "c7aaa144-2b28-48f6-8398-d4b0766f53f4" (UID: "c7aaa144-2b28-48f6-8398-d4b0766f53f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351876 4700 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351905 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351919 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351931 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351945 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtjtd\" (UniqueName: \"kubernetes.io/projected/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-kube-api-access-qtjtd\") on node \"crc\" 
DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351956 4700 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351966 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351977 4700 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.351989 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aaa144-2b28-48f6-8398-d4b0766f53f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.352002 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcmbv\" (UniqueName: \"kubernetes.io/projected/e383a2d1-cd6e-4658-9148-ef9c25c63430-kube-api-access-qcmbv\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.352013 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.352023 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e383a2d1-cd6e-4658-9148-ef9c25c63430-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.352034 4700 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aaa144-2b28-48f6-8398-d4b0766f53f4-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.352046 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qj4w\" (UniqueName: \"kubernetes.io/projected/c7aaa144-2b28-48f6-8398-d4b0766f53f4-kube-api-access-9qj4w\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.353085 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.377891 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55444c599f-s65df"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.442713 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-65bb648478-f5h6h"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.478160 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.479620 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-combined-ca-bundle\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.480277 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data-custom\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.480352 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htnh9\" (UniqueName: \"kubernetes.io/projected/f28d9836-f2c1-4a60-97fd-324ba6b0331b-kube-api-access-htnh9\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.480440 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.480661 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f28d9836-f2c1-4a60-97fd-324ba6b0331b-logs\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.482124 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.489810 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65bb648478-f5h6h"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.503058 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lp48h"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.515740 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z8zs9"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.533372 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.535638 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.542334 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.581868 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data-custom\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.581909 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htnh9\" (UniqueName: \"kubernetes.io/projected/f28d9836-f2c1-4a60-97fd-324ba6b0331b-kube-api-access-htnh9\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.581941 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.581961 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.581984 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582001 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582029 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rhm\" (UniqueName: \"kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582060 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data-custom\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582084 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-combined-ca-bundle\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc 
kubenswrapper[4700]: I1007 11:37:43.582107 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582122 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582141 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7wf\" (UniqueName: \"kubernetes.io/projected/62cb738e-4901-49c1-8516-02b0c2a44482-kube-api-access-dd7wf\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582170 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62cb738e-4901-49c1-8516-02b0c2a44482-logs\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582186 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f28d9836-f2c1-4a60-97fd-324ba6b0331b-logs\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " 
pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582215 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.582238 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-combined-ca-bundle\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.586157 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f28d9836-f2c1-4a60-97fd-324ba6b0331b-logs\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.593568 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data-custom\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.595004 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-config-data\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " 
pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.595497 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28d9836-f2c1-4a60-97fd-324ba6b0331b-combined-ca-bundle\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.601813 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.603617 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.613593 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.613628 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htnh9\" (UniqueName: \"kubernetes.io/projected/f28d9836-f2c1-4a60-97fd-324ba6b0331b-kube-api-access-htnh9\") pod \"barbican-worker-55444c599f-s65df\" (UID: \"f28d9836-f2c1-4a60-97fd-324ba6b0331b\") " pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.628809 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.682962 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683289 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683355 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rhm\" (UniqueName: \"kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683371 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683408 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data-custom\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683434 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-combined-ca-bundle\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc 
kubenswrapper[4700]: I1007 11:37:43.683456 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683473 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683497 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7wf\" (UniqueName: \"kubernetes.io/projected/62cb738e-4901-49c1-8516-02b0c2a44482-kube-api-access-dd7wf\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683518 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683549 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62cb738e-4901-49c1-8516-02b0c2a44482-logs\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " 
pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683572 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683603 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtln\" (UniqueName: \"kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683634 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683710 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.683742 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: 
\"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.684078 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62cb738e-4901-49c1-8516-02b0c2a44482-logs\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.684517 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.684639 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.684810 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.685028 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 
crc kubenswrapper[4700]: I1007 11:37:43.686565 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.686885 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data-custom\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.687558 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-combined-ca-bundle\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.688462 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cb738e-4901-49c1-8516-02b0c2a44482-config-data\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.695925 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-55444c599f-s65df" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.701128 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7wf\" (UniqueName: \"kubernetes.io/projected/62cb738e-4901-49c1-8516-02b0c2a44482-kube-api-access-dd7wf\") pod \"barbican-keystone-listener-65bb648478-f5h6h\" (UID: \"62cb738e-4901-49c1-8516-02b0c2a44482\") " pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.704232 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rhm\" (UniqueName: \"kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm\") pod \"dnsmasq-dns-59d5ff467f-795l6\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.785001 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.785055 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.785116 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: 
\"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.785151 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.785184 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtln\" (UniqueName: \"kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.786019 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.789682 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.792534 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 
11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.792669 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.811667 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtln\" (UniqueName: \"kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln\") pod \"barbican-api-85c8cc6bbb-jj2g6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.847520 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.881690 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:43 crc kubenswrapper[4700]: I1007 11:37:43.949879 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.215409 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerStarted","Data":"8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334"} Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.230585 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lp48h" event={"ID":"84e22581-4f7e-45b6-9932-aaa790c7825f","Type":"ContainerStarted","Data":"a56619c5f39829f3abaadaf91251ff9cacc623356bfc156f635c57b51b6199be"} Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.233579 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z8zs9" event={"ID":"587378c1-b0c6-4d31-980f-0b6e8e271903","Type":"ContainerStarted","Data":"f4bc679c9b20142445a482ac4aba8ea4e86969665d47bbf62800fd88ca36709c"} Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.233628 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z8zs9" event={"ID":"587378c1-b0c6-4d31-980f-0b6e8e271903","Type":"ContainerStarted","Data":"d701a11c28a34cb1bb813d00b32fbdf8fe89c08461926908957ee1f67ba8f5d2"} Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.249816 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55444c599f-s65df"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.269477 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8444f487fd-js794"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.270999 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.273297 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.273529 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.273822 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-djssz" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.274047 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.274191 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.274346 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.286064 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z8zs9" podStartSLOduration=3.286047417 podStartE2EDuration="3.286047417s" podCreationTimestamp="2025-10-07 11:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:44.279169267 +0000 UTC m=+1031.075568256" watchObservedRunningTime="2025-10-07 11:37:44.286047417 +0000 UTC m=+1031.082446406" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.286346 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65bb648478-f5h6h"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.309137 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-internal-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.309174 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-public-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.309200 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-config-data\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.309242 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4mtf\" (UniqueName: \"kubernetes.io/projected/2300bc48-b64d-42ea-bc78-be6ca9508d5b-kube-api-access-m4mtf\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.319124 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-scripts\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.319200 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-credential-keys\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.319336 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-combined-ca-bundle\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.319393 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-fernet-keys\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.336175 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8444f487fd-js794"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.363818 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7d99cbbb56-xqb5x"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.365483 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.368645 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.368909 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.369103 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7lc5j" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.369235 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.369390 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.410065 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d99cbbb56-xqb5x"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422682 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-combined-ca-bundle\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422754 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-fernet-keys\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422794 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36efa6df-bc80-48f6-8611-e8dff3530d8e-logs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422817 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmnp\" (UniqueName: \"kubernetes.io/projected/36efa6df-bc80-48f6-8611-e8dff3530d8e-kube-api-access-5cmnp\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422879 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-internal-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422906 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-internal-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422927 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-public-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422948 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-config-data\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.422989 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4mtf\" (UniqueName: \"kubernetes.io/projected/2300bc48-b64d-42ea-bc78-be6ca9508d5b-kube-api-access-m4mtf\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423014 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-config-data\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423055 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-scripts\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423080 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-combined-ca-bundle\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423104 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-scripts\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423145 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-credential-keys\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.423178 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-public-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.432151 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-config-data\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.432791 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-combined-ca-bundle\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.434805 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-fernet-keys\") pod 
\"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.435178 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-internal-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.439772 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-public-tls-certs\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.440065 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-scripts\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.440391 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2300bc48-b64d-42ea-bc78-be6ca9508d5b-credential-keys\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.449809 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4mtf\" (UniqueName: \"kubernetes.io/projected/2300bc48-b64d-42ea-bc78-be6ca9508d5b-kube-api-access-m4mtf\") pod \"keystone-8444f487fd-js794\" (UID: \"2300bc48-b64d-42ea-bc78-be6ca9508d5b\") " pod="openstack/keystone-8444f487fd-js794" 
Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.467120 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.467168 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.484587 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.484637 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.515005 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.517760 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525451 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-internal-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525570 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-config-data\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525633 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-scripts\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525665 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-combined-ca-bundle\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525742 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-public-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525886 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36efa6df-bc80-48f6-8611-e8dff3530d8e-logs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.525912 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmnp\" (UniqueName: \"kubernetes.io/projected/36efa6df-bc80-48f6-8611-e8dff3530d8e-kube-api-access-5cmnp\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.527927 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/36efa6df-bc80-48f6-8611-e8dff3530d8e-logs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.535269 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-internal-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.545912 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-config-data\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.546083 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-scripts\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.552635 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.559854 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-public-tls-certs\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.574334 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-5cmnp\" (UniqueName: \"kubernetes.io/projected/36efa6df-bc80-48f6-8611-e8dff3530d8e-kube-api-access-5cmnp\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.585915 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36efa6df-bc80-48f6-8611-e8dff3530d8e-combined-ca-bundle\") pod \"placement-7d99cbbb56-xqb5x\" (UID: \"36efa6df-bc80-48f6-8611-e8dff3530d8e\") " pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.595166 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.605847 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.658225 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.697831 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:37:44 crc kubenswrapper[4700]: I1007 11:37:44.716089 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.245585 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" event={"ID":"62cb738e-4901-49c1-8516-02b0c2a44482","Type":"ContainerStarted","Data":"c11e2a45be6880f1998db7620752207508392c3a1846e29ab2a4dba6ea460064"} Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.253335 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerStarted","Data":"a79b741caa427e79abebaea6f104a5385a2eb562a49992b8e0bb6dccb5dd4015"} Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.266399 4700 generic.go:334] "Generic (PLEG): container finished" podID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerID="056c7b430e0188ff314111a56c4956d1ceb343cde4e15133609ce7a369e8a830" exitCode=0 Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.266464 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" event={"ID":"5fce846e-41b6-4b09-a5db-236763b8e5f9","Type":"ContainerDied","Data":"056c7b430e0188ff314111a56c4956d1ceb343cde4e15133609ce7a369e8a830"} Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.266490 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" event={"ID":"5fce846e-41b6-4b09-a5db-236763b8e5f9","Type":"ContainerStarted","Data":"62415a0b8fe11be852914acc361137694130eb0cf4d453a8cfa3931efb545382"} Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.273264 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55444c599f-s65df" event={"ID":"f28d9836-f2c1-4a60-97fd-324ba6b0331b","Type":"ContainerStarted","Data":"a4b6f7ef4876fe8172ccdaba31ca9d255e7102d0256caa408c584183605b09b3"} Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.274714 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.274750 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.274760 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.274772 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.451815 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8444f487fd-js794"] Oct 07 11:37:45 crc kubenswrapper[4700]: I1007 11:37:45.591928 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d99cbbb56-xqb5x"] Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.334479 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8444f487fd-js794" event={"ID":"2300bc48-b64d-42ea-bc78-be6ca9508d5b","Type":"ContainerStarted","Data":"10b9a959f24244c7bc9d2fbf92a6a71d57e16b1d7a1451f82dba603599ce71e0"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.335003 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8444f487fd-js794" event={"ID":"2300bc48-b64d-42ea-bc78-be6ca9508d5b","Type":"ContainerStarted","Data":"ebaca41c65d197cf374b6340ae65d498fb0b24aaaf30d2f7ecf6f6ff350ccf6d"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.339797 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-746789dcd4-wsdtq"] Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.341415 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.346595 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerStarted","Data":"9d4ef75aeb731bef542854e31e3e24f596e9d158208027f738c25f2d21ee234e"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.346761 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.346772 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerStarted","Data":"99d8cdd0dc4c59f3a514986754227c73f83585b40756fceb170120a7f36b96ff"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.347088 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.355161 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d99cbbb56-xqb5x" event={"ID":"36efa6df-bc80-48f6-8611-e8dff3530d8e","Type":"ContainerStarted","Data":"7648830bccc5bf1932d84a81ff3235e4919eef3439f6d04f7e21ad702b4a71e5"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.355199 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d99cbbb56-xqb5x" event={"ID":"36efa6df-bc80-48f6-8611-e8dff3530d8e","Type":"ContainerStarted","Data":"7e78b3d8e23531e19bf0f22092c84ec4715827a09e1f2292c16eec95cf239c1f"} Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.358663 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" event={"ID":"5fce846e-41b6-4b09-a5db-236763b8e5f9","Type":"ContainerStarted","Data":"ac3e04171ef90a66417460183beaf2d07c8259a7eb36db92868dc5bc4b3aab4d"} Oct 07 11:37:46 
crc kubenswrapper[4700]: I1007 11:37:46.359423 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.391354 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746789dcd4-wsdtq"] Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409096 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7vr\" (UniqueName: \"kubernetes.io/projected/c1e6ae51-277f-403c-a01a-5786160b1298-kube-api-access-5q7vr\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409144 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409175 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e6ae51-277f-403c-a01a-5786160b1298-logs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409195 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-public-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 
11:37:46.409287 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-combined-ca-bundle\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409373 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-internal-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.409482 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data-custom\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.411112 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" podStartSLOduration=3.4110959530000002 podStartE2EDuration="3.411095953s" podCreationTimestamp="2025-10-07 11:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:46.405762414 +0000 UTC m=+1033.202161403" watchObservedRunningTime="2025-10-07 11:37:46.411095953 +0000 UTC m=+1033.207494942" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510736 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7vr\" (UniqueName: 
\"kubernetes.io/projected/c1e6ae51-277f-403c-a01a-5786160b1298-kube-api-access-5q7vr\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510771 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510800 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-public-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510816 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e6ae51-277f-403c-a01a-5786160b1298-logs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510874 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-combined-ca-bundle\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510902 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-internal-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.510953 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data-custom\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.511797 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e6ae51-277f-403c-a01a-5786160b1298-logs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.516757 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data-custom\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.516793 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-internal-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.521751 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-public-tls-certs\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.523710 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-config-data\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.546059 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7vr\" (UniqueName: \"kubernetes.io/projected/c1e6ae51-277f-403c-a01a-5786160b1298-kube-api-access-5q7vr\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.558663 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e6ae51-277f-403c-a01a-5786160b1298-combined-ca-bundle\") pod \"barbican-api-746789dcd4-wsdtq\" (UID: \"c1e6ae51-277f-403c-a01a-5786160b1298\") " pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:46 crc kubenswrapper[4700]: I1007 11:37:46.680285 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.340503 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746789dcd4-wsdtq"] Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.385690 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.385728 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.386084 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.386105 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.386171 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746789dcd4-wsdtq" event={"ID":"c1e6ae51-277f-403c-a01a-5786160b1298","Type":"ContainerStarted","Data":"00aae8da0b07c50b6b6081a9053b4d4ad79560110b6b8641cab50b2afa801d09"} Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.386893 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8444f487fd-js794" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.448817 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" podStartSLOduration=4.448687424 podStartE2EDuration="4.448687424s" podCreationTimestamp="2025-10-07 11:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:47.446676602 +0000 UTC m=+1034.243075611" watchObservedRunningTime="2025-10-07 11:37:47.448687424 +0000 UTC m=+1034.245086433" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.454971 4700 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-8444f487fd-js794" podStartSLOduration=3.454953958 podStartE2EDuration="3.454953958s" podCreationTimestamp="2025-10-07 11:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:47.418863758 +0000 UTC m=+1034.215262747" watchObservedRunningTime="2025-10-07 11:37:47.454953958 +0000 UTC m=+1034.251352947" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.866286 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 11:37:47 crc kubenswrapper[4700]: I1007 11:37:47.975016 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.036068 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.041288 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.405525 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d99cbbb56-xqb5x" event={"ID":"36efa6df-bc80-48f6-8611-e8dff3530d8e","Type":"ContainerStarted","Data":"237a3cee585f74724f891fcc569625ce52093f9f8f9289e6657eec9a9f92686a"} Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.426522 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7d99cbbb56-xqb5x" podStartSLOduration=4.426509153 podStartE2EDuration="4.426509153s" podCreationTimestamp="2025-10-07 11:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:37:48.424961153 +0000 UTC m=+1035.221360142" 
watchObservedRunningTime="2025-10-07 11:37:48.426509153 +0000 UTC m=+1035.222908132" Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.951541 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:48 crc kubenswrapper[4700]: I1007 11:37:48.951905 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:49 crc kubenswrapper[4700]: I1007 11:37:49.416118 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746789dcd4-wsdtq" event={"ID":"c1e6ae51-277f-403c-a01a-5786160b1298","Type":"ContainerStarted","Data":"c69e4e5cde4c8e92edcfd878a44d92eb2d470c04f78d605035c59c26d1940d14"} Oct 07 11:37:49 crc kubenswrapper[4700]: I1007 11:37:49.416666 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:49 crc kubenswrapper[4700]: I1007 11:37:49.416693 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:37:53 crc kubenswrapper[4700]: I1007 11:37:53.883569 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:37:53 crc kubenswrapper[4700]: I1007 11:37:53.994871 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:53 crc kubenswrapper[4700]: I1007 11:37:53.995126 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="dnsmasq-dns" containerID="cri-o://f5095e70592475b3be954a50f424d62a14a9c7e593aed7f0f40b3c20ec5499f7" gracePeriod=10 Oct 07 11:37:54 crc kubenswrapper[4700]: I1007 11:37:54.468215 4700 generic.go:334] "Generic (PLEG): container finished" podID="57d2908e-ab77-494f-91e7-dcdabee84614" 
containerID="f5095e70592475b3be954a50f424d62a14a9c7e593aed7f0f40b3c20ec5499f7" exitCode=0 Oct 07 11:37:54 crc kubenswrapper[4700]: I1007 11:37:54.468322 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" event={"ID":"57d2908e-ab77-494f-91e7-dcdabee84614","Type":"ContainerDied","Data":"f5095e70592475b3be954a50f424d62a14a9c7e593aed7f0f40b3c20ec5499f7"} Oct 07 11:37:54 crc kubenswrapper[4700]: I1007 11:37:54.742415 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Oct 07 11:37:55 crc kubenswrapper[4700]: I1007 11:37:55.484779 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:55 crc kubenswrapper[4700]: I1007 11:37:55.606514 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:37:58 crc kubenswrapper[4700]: I1007 11:37:58.905856 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084020 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084135 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084177 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084259 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084296 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx6kv\" (UniqueName: \"kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.084404 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0\") pod \"57d2908e-ab77-494f-91e7-dcdabee84614\" (UID: \"57d2908e-ab77-494f-91e7-dcdabee84614\") " Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.094742 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv" (OuterVolumeSpecName: "kube-api-access-fx6kv") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "kube-api-access-fx6kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.151621 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.154625 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.165552 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.175743 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config" (OuterVolumeSpecName: "config") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.188866 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57d2908e-ab77-494f-91e7-dcdabee84614" (UID: "57d2908e-ab77-494f-91e7-dcdabee84614"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189026 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189038 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189051 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189060 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189068 
4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx6kv\" (UniqueName: \"kubernetes.io/projected/57d2908e-ab77-494f-91e7-dcdabee84614-kube-api-access-fx6kv\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.189077 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57d2908e-ab77-494f-91e7-dcdabee84614-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.517937 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" event={"ID":"57d2908e-ab77-494f-91e7-dcdabee84614","Type":"ContainerDied","Data":"83e44672eea7cd7130dd1e5e7895bc0a70fd9d4b89898f2d5f8124045e3ce716"} Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.517995 4700 scope.go:117] "RemoveContainer" containerID="f5095e70592475b3be954a50f424d62a14a9c7e593aed7f0f40b3c20ec5499f7" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.518137 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jkbj4" Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.553358 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.559739 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jkbj4"] Oct 07 11:37:59 crc kubenswrapper[4700]: I1007 11:37:59.976984 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" path="/var/lib/kubelet/pods/57d2908e-ab77-494f-91e7-dcdabee84614/volumes" Oct 07 11:38:00 crc kubenswrapper[4700]: I1007 11:38:00.339753 4700 scope.go:117] "RemoveContainer" containerID="08f28cacd2f8e7af07963b9aa11dd780a92ddb49d61f1f09366669f1c8790b29" Oct 07 11:38:00 crc kubenswrapper[4700]: E1007 11:38:00.356657 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 07 11:38:00 crc kubenswrapper[4700]: E1007 11:38:00.356868 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hl8jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4f1fee1b-3e64-4266-b787-c2804804a232): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 11:38:00 crc kubenswrapper[4700]: E1007 11:38:00.358095 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" Oct 07 11:38:00 crc kubenswrapper[4700]: I1007 11:38:00.554794 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-central-agent" containerID="cri-o://1f0493224277713d33d3c5331e8bb336b71e555a8e3b1707531c13cd3a7e06ba" gracePeriod=30 Oct 07 11:38:00 crc kubenswrapper[4700]: I1007 11:38:00.555210 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="sg-core" 
containerID="cri-o://8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334" gracePeriod=30 Oct 07 11:38:00 crc kubenswrapper[4700]: I1007 11:38:00.555362 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-notification-agent" containerID="cri-o://baa5af43a455336c1da8d2c15456158bfd440fe9e926e442d93be4e6efa18560" gracePeriod=30 Oct 07 11:38:00 crc kubenswrapper[4700]: E1007 11:38:00.664230 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1fee1b_3e64_4266_b787_c2804804a232.slice/crio-8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1fee1b_3e64_4266_b787_c2804804a232.slice/crio-conmon-8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334.scope\": RecentStats: unable to find data in memory cache]" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.564914 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" event={"ID":"62cb738e-4901-49c1-8516-02b0c2a44482","Type":"ContainerStarted","Data":"7e9378d2ad6a481062a80176e487f11103a99ce7a9be2dc2dee37ab941c600d3"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.565251 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" event={"ID":"62cb738e-4901-49c1-8516-02b0c2a44482","Type":"ContainerStarted","Data":"ec703d2e3e1c251328f6a2ca9ff8b64cdebf29522baa4f1cb7caddb99154f5cc"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.567083 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746789dcd4-wsdtq" 
event={"ID":"c1e6ae51-277f-403c-a01a-5786160b1298","Type":"ContainerStarted","Data":"387f8935b95a98c9231323b8fe3570c41828998c71adab9345720a3ac07ae16f"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.567357 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.571265 4700 generic.go:334] "Generic (PLEG): container finished" podID="4f1fee1b-3e64-4266-b787-c2804804a232" containerID="8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334" exitCode=2 Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.571307 4700 generic.go:334] "Generic (PLEG): container finished" podID="4f1fee1b-3e64-4266-b787-c2804804a232" containerID="1f0493224277713d33d3c5331e8bb336b71e555a8e3b1707531c13cd3a7e06ba" exitCode=0 Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.571370 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerDied","Data":"8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.571400 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerDied","Data":"1f0493224277713d33d3c5331e8bb336b71e555a8e3b1707531c13cd3a7e06ba"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.572953 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lp48h" event={"ID":"84e22581-4f7e-45b6-9932-aaa790c7825f","Type":"ContainerStarted","Data":"f47c5fc54b832e365d48fd596c6fe520b6734a1d06be329d0926b66509fcc143"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.576753 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55444c599f-s65df" 
event={"ID":"f28d9836-f2c1-4a60-97fd-324ba6b0331b","Type":"ContainerStarted","Data":"b05c86a98880d9d520beded784728bba006dcd091e7057653fec557a275649af"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.576804 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55444c599f-s65df" event={"ID":"f28d9836-f2c1-4a60-97fd-324ba6b0331b","Type":"ContainerStarted","Data":"7a2bb83598464f7828c75daef2b130678a0c39178676c9e23f317f67c6d5ddcb"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.584249 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bqgg5" event={"ID":"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b","Type":"ContainerStarted","Data":"8b96c58a36186d71ac832d39b310bc7f1fae8b9a4826f510d713d33a95b97532"} Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.621282 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-65bb648478-f5h6h" podStartSLOduration=2.646786606 podStartE2EDuration="18.621255034s" podCreationTimestamp="2025-10-07 11:37:43 +0000 UTC" firstStartedPulling="2025-10-07 11:37:44.327629945 +0000 UTC m=+1031.124028924" lastFinishedPulling="2025-10-07 11:38:00.302098333 +0000 UTC m=+1047.098497352" observedRunningTime="2025-10-07 11:38:01.588549802 +0000 UTC m=+1048.384948791" watchObservedRunningTime="2025-10-07 11:38:01.621255034 +0000 UTC m=+1048.417654023" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.630680 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-746789dcd4-wsdtq" podStartSLOduration=15.630646578 podStartE2EDuration="15.630646578s" podCreationTimestamp="2025-10-07 11:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:01.613225805 +0000 UTC m=+1048.409624794" watchObservedRunningTime="2025-10-07 11:38:01.630646578 +0000 UTC m=+1048.427045567" Oct 07 
11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.651052 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bqgg5" podStartSLOduration=3.320165394 podStartE2EDuration="46.651027939s" podCreationTimestamp="2025-10-07 11:37:15 +0000 UTC" firstStartedPulling="2025-10-07 11:37:17.13371785 +0000 UTC m=+1003.930116839" lastFinishedPulling="2025-10-07 11:38:00.464580395 +0000 UTC m=+1047.260979384" observedRunningTime="2025-10-07 11:38:01.63800799 +0000 UTC m=+1048.434406969" watchObservedRunningTime="2025-10-07 11:38:01.651027939 +0000 UTC m=+1048.447426928" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.660965 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-55444c599f-s65df" podStartSLOduration=2.680466473 podStartE2EDuration="18.660922827s" podCreationTimestamp="2025-10-07 11:37:43 +0000 UTC" firstStartedPulling="2025-10-07 11:37:44.321731051 +0000 UTC m=+1031.118130040" lastFinishedPulling="2025-10-07 11:38:00.302187395 +0000 UTC m=+1047.098586394" observedRunningTime="2025-10-07 11:38:01.657009435 +0000 UTC m=+1048.453408434" watchObservedRunningTime="2025-10-07 11:38:01.660922827 +0000 UTC m=+1048.457321826" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.681162 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:38:01 crc kubenswrapper[4700]: I1007 11:38:01.685431 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-lp48h" podStartSLOduration=4.86808109 podStartE2EDuration="21.685410285s" podCreationTimestamp="2025-10-07 11:37:40 +0000 UTC" firstStartedPulling="2025-10-07 11:37:43.482337814 +0000 UTC m=+1030.278736803" lastFinishedPulling="2025-10-07 11:38:00.299666969 +0000 UTC m=+1047.096065998" observedRunningTime="2025-10-07 11:38:01.676918584 +0000 UTC m=+1048.473317593" watchObservedRunningTime="2025-10-07 
11:38:01.685410285 +0000 UTC m=+1048.481809274" Oct 07 11:38:03 crc kubenswrapper[4700]: I1007 11:38:03.369557 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.627770 4700 generic.go:334] "Generic (PLEG): container finished" podID="4f1fee1b-3e64-4266-b787-c2804804a232" containerID="baa5af43a455336c1da8d2c15456158bfd440fe9e926e442d93be4e6efa18560" exitCode=0 Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.627827 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerDied","Data":"baa5af43a455336c1da8d2c15456158bfd440fe9e926e442d93be4e6efa18560"} Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.628377 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1fee1b-3e64-4266-b787-c2804804a232","Type":"ContainerDied","Data":"35e0eb63f58e38d3c93eb63ed4c6ccc9c56b01729c0bdd379305b5abe24008c9"} Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.628399 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e0eb63f58e38d3c93eb63ed4c6ccc9c56b01729c0bdd379305b5abe24008c9" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.660460 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.809231 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.809730 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.809847 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.809973 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.810054 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.810197 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl8jd\" (UniqueName: 
\"kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.810279 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd\") pod \"4f1fee1b-3e64-4266-b787-c2804804a232\" (UID: \"4f1fee1b-3e64-4266-b787-c2804804a232\") " Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.811187 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.811444 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.828652 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd" (OuterVolumeSpecName: "kube-api-access-hl8jd") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "kube-api-access-hl8jd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.829463 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts" (OuterVolumeSpecName: "scripts") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.844113 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.858670 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.882666 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data" (OuterVolumeSpecName: "config-data") pod "4f1fee1b-3e64-4266-b787-c2804804a232" (UID: "4f1fee1b-3e64-4266-b787-c2804804a232"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912507 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912554 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912570 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl8jd\" (UniqueName: \"kubernetes.io/projected/4f1fee1b-3e64-4266-b787-c2804804a232-kube-api-access-hl8jd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912586 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912596 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912607 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1fee1b-3e64-4266-b787-c2804804a232-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:05 crc kubenswrapper[4700]: I1007 11:38:05.912618 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1fee1b-3e64-4266-b787-c2804804a232-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.640331 4700 generic.go:334] "Generic 
(PLEG): container finished" podID="84e22581-4f7e-45b6-9932-aaa790c7825f" containerID="f47c5fc54b832e365d48fd596c6fe520b6734a1d06be329d0926b66509fcc143" exitCode=0 Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.640428 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lp48h" event={"ID":"84e22581-4f7e-45b6-9932-aaa790c7825f","Type":"ContainerDied","Data":"f47c5fc54b832e365d48fd596c6fe520b6734a1d06be329d0926b66509fcc143"} Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.640455 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.728659 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.734703 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.749886 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:06 crc kubenswrapper[4700]: E1007 11:38:06.750255 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="init" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750271 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="init" Oct 07 11:38:06 crc kubenswrapper[4700]: E1007 11:38:06.750293 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="dnsmasq-dns" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750300 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="dnsmasq-dns" Oct 07 11:38:06 crc kubenswrapper[4700]: E1007 11:38:06.750443 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="sg-core" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750451 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="sg-core" Oct 07 11:38:06 crc kubenswrapper[4700]: E1007 11:38:06.750480 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-notification-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750488 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-notification-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: E1007 11:38:06.750500 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-central-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750506 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-central-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750676 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="sg-core" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750685 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-notification-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750700 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d2908e-ab77-494f-91e7-dcdabee84614" containerName="dnsmasq-dns" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.750711 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" containerName="ceilometer-central-agent" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.752150 4700 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.755006 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.755365 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.773777 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829423 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829508 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829737 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829808 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkmfw\" (UniqueName: \"kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw\") pod \"ceilometer-0\" (UID: 
\"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829887 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829915 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.829963 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.931853 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.931975 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932050 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932108 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkmfw\" (UniqueName: \"kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932142 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932164 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932203 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932517 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd\") pod \"ceilometer-0\" (UID: 
\"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.932744 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.939116 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.939465 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.940124 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.942826 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:06 crc kubenswrapper[4700]: I1007 11:38:06.955634 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkmfw\" (UniqueName: 
\"kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw\") pod \"ceilometer-0\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " pod="openstack/ceilometer-0" Oct 07 11:38:07 crc kubenswrapper[4700]: I1007 11:38:07.076968 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:07 crc kubenswrapper[4700]: I1007 11:38:07.592920 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:07 crc kubenswrapper[4700]: W1007 11:38:07.600675 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a92f3ac_8266_4315_92f9_87bfffdc5660.slice/crio-5722fb1c2acc9096791376874910328a1debff62375f03abbfba347da299ef57 WatchSource:0}: Error finding container 5722fb1c2acc9096791376874910328a1debff62375f03abbfba347da299ef57: Status 404 returned error can't find the container with id 5722fb1c2acc9096791376874910328a1debff62375f03abbfba347da299ef57 Oct 07 11:38:07 crc kubenswrapper[4700]: I1007 11:38:07.651871 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerStarted","Data":"5722fb1c2acc9096791376874910328a1debff62375f03abbfba347da299ef57"} Oct 07 11:38:07 crc kubenswrapper[4700]: I1007 11:38:07.969975 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1fee1b-3e64-4266-b787-c2804804a232" path="/var/lib/kubelet/pods/4f1fee1b-3e64-4266-b787-c2804804a232/volumes" Oct 07 11:38:07 crc kubenswrapper[4700]: I1007 11:38:07.991980 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lp48h" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.052024 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data\") pod \"84e22581-4f7e-45b6-9932-aaa790c7825f\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.052109 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle\") pod \"84e22581-4f7e-45b6-9932-aaa790c7825f\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.052912 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74v9p\" (UniqueName: \"kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p\") pod \"84e22581-4f7e-45b6-9932-aaa790c7825f\" (UID: \"84e22581-4f7e-45b6-9932-aaa790c7825f\") " Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.060510 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p" (OuterVolumeSpecName: "kube-api-access-74v9p") pod "84e22581-4f7e-45b6-9932-aaa790c7825f" (UID: "84e22581-4f7e-45b6-9932-aaa790c7825f"). InnerVolumeSpecName "kube-api-access-74v9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.084715 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84e22581-4f7e-45b6-9932-aaa790c7825f" (UID: "84e22581-4f7e-45b6-9932-aaa790c7825f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.121862 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data" (OuterVolumeSpecName: "config-data") pod "84e22581-4f7e-45b6-9932-aaa790c7825f" (UID: "84e22581-4f7e-45b6-9932-aaa790c7825f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.155330 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.155366 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e22581-4f7e-45b6-9932-aaa790c7825f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.155383 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74v9p\" (UniqueName: \"kubernetes.io/projected/84e22581-4f7e-45b6-9932-aaa790c7825f-kube-api-access-74v9p\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.339351 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746789dcd4-wsdtq" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.475601 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.476192 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api-log" containerID="cri-o://99d8cdd0dc4c59f3a514986754227c73f83585b40756fceb170120a7f36b96ff" 
gracePeriod=30 Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.476364 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api" containerID="cri-o://9d4ef75aeb731bef542854e31e3e24f596e9d158208027f738c25f2d21ee234e" gracePeriod=30 Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.671828 4700 generic.go:334] "Generic (PLEG): container finished" podID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerID="99d8cdd0dc4c59f3a514986754227c73f83585b40756fceb170120a7f36b96ff" exitCode=143 Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.671888 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerDied","Data":"99d8cdd0dc4c59f3a514986754227c73f83585b40756fceb170120a7f36b96ff"} Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.675448 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lp48h" event={"ID":"84e22581-4f7e-45b6-9932-aaa790c7825f","Type":"ContainerDied","Data":"a56619c5f39829f3abaadaf91251ff9cacc623356bfc156f635c57b51b6199be"} Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.675487 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a56619c5f39829f3abaadaf91251ff9cacc623356bfc156f635c57b51b6199be" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.676293 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lp48h" Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.680277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerStarted","Data":"e6d56bce692368beab37f3b7e247428068e180485ce31f6d8762a56c4b56248c"} Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.682001 4700 generic.go:334] "Generic (PLEG): container finished" podID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" containerID="8b96c58a36186d71ac832d39b310bc7f1fae8b9a4826f510d713d33a95b97532" exitCode=0 Oct 07 11:38:08 crc kubenswrapper[4700]: I1007 11:38:08.682028 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bqgg5" event={"ID":"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b","Type":"ContainerDied","Data":"8b96c58a36186d71ac832d39b310bc7f1fae8b9a4826f510d713d33a95b97532"} Oct 07 11:38:09 crc kubenswrapper[4700]: I1007 11:38:09.691612 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerStarted","Data":"2739cdef6b7e2e259fb240f04cddb29fea6bbf362e7d5f1135d21f8fa61bb21b"} Oct 07 11:38:09 crc kubenswrapper[4700]: I1007 11:38:09.694136 4700 generic.go:334] "Generic (PLEG): container finished" podID="587378c1-b0c6-4d31-980f-0b6e8e271903" containerID="f4bc679c9b20142445a482ac4aba8ea4e86969665d47bbf62800fd88ca36709c" exitCode=0 Oct 07 11:38:09 crc kubenswrapper[4700]: I1007 11:38:09.694205 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z8zs9" event={"ID":"587378c1-b0c6-4d31-980f-0b6e8e271903","Type":"ContainerDied","Data":"f4bc679c9b20142445a482ac4aba8ea4e86969665d47bbf62800fd88ca36709c"} Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.038885 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090178 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090253 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090346 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090381 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090372 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090480 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.090537 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twqj8\" (UniqueName: \"kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8\") pod \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\" (UID: \"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b\") " Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.091820 4700 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.098158 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts" (OuterVolumeSpecName: "scripts") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.099451 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8" (OuterVolumeSpecName: "kube-api-access-twqj8") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "kube-api-access-twqj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.101041 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.124476 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.146277 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data" (OuterVolumeSpecName: "config-data") pod "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" (UID: "bdc86b02-ca87-4bda-869a-6fd42e5f5f1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.193621 4700 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.193740 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twqj8\" (UniqueName: \"kubernetes.io/projected/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-kube-api-access-twqj8\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.193846 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.193905 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.193959 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.710031 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerStarted","Data":"93173686d028ce24f9157f7fdf9d1711876ac324a6aadc898e0a3f2526f1e730"} Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.711834 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bqgg5" event={"ID":"bdc86b02-ca87-4bda-869a-6fd42e5f5f1b","Type":"ContainerDied","Data":"c42280b85c4a6f7e681273f655f89bcbca0f0635ac011e1092394d6aa2478902"} Oct 
07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.711871 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bqgg5" Oct 07 11:38:10 crc kubenswrapper[4700]: I1007 11:38:10.711880 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42280b85c4a6f7e681273f655f89bcbca0f0635ac011e1092394d6aa2478902" Oct 07 11:38:10 crc kubenswrapper[4700]: E1007 11:38:10.966864 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc86b02_ca87_4bda_869a_6fd42e5f5f1b.slice/crio-c42280b85c4a6f7e681273f655f89bcbca0f0635ac011e1092394d6aa2478902\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc86b02_ca87_4bda_869a_6fd42e5f5f1b.slice\": RecentStats: unable to find data in memory cache]" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.033899 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:11 crc kubenswrapper[4700]: E1007 11:38:11.034980 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e22581-4f7e-45b6-9932-aaa790c7825f" containerName="heat-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.035003 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e22581-4f7e-45b6-9932-aaa790c7825f" containerName="heat-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: E1007 11:38:11.035037 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" containerName="cinder-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.035046 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" containerName="cinder-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.035245 4700 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" containerName="cinder-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.035289 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e22581-4f7e-45b6-9932-aaa790c7825f" containerName="heat-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.036380 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.040117 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.040426 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.040552 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.040751 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jtrf6" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.046679 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115188 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115244 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115283 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115333 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115375 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.115430 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4h98\" (UniqueName: \"kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.123077 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.128302 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.154877 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.159440 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.216537 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config\") pod \"587378c1-b0c6-4d31-980f-0b6e8e271903\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.216619 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle\") pod \"587378c1-b0c6-4d31-980f-0b6e8e271903\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.216927 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xzm\" (UniqueName: \"kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm\") pod \"587378c1-b0c6-4d31-980f-0b6e8e271903\" (UID: \"587378c1-b0c6-4d31-980f-0b6e8e271903\") " Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217227 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217252 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hfv6d\" (UniqueName: \"kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217271 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217337 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217356 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217410 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217440 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217496 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217536 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4h98\" (UniqueName: \"kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217571 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217601 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217642 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.217740 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.223754 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm" (OuterVolumeSpecName: "kube-api-access-t5xzm") pod "587378c1-b0c6-4d31-980f-0b6e8e271903" (UID: "587378c1-b0c6-4d31-980f-0b6e8e271903"). InnerVolumeSpecName "kube-api-access-t5xzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.228394 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.228615 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.228621 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.237755 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.243136 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4h98\" (UniqueName: \"kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98\") pod \"cinder-scheduler-0\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.258432 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "587378c1-b0c6-4d31-980f-0b6e8e271903" (UID: "587378c1-b0c6-4d31-980f-0b6e8e271903"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.291566 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:11 crc kubenswrapper[4700]: E1007 11:38:11.292039 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587378c1-b0c6-4d31-980f-0b6e8e271903" containerName="neutron-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.292066 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="587378c1-b0c6-4d31-980f-0b6e8e271903" containerName="neutron-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.292276 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="587378c1-b0c6-4d31-980f-0b6e8e271903" containerName="neutron-db-sync" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.293746 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.300828 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.314293 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config" (OuterVolumeSpecName: "config") pod "587378c1-b0c6-4d31-980f-0b6e8e271903" (UID: "587378c1-b0c6-4d31-980f-0b6e8e271903"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.317528 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318714 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318792 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318853 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318902 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318968 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfv6d\" (UniqueName: \"kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d\") 
pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.318991 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.319041 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5xzm\" (UniqueName: \"kubernetes.io/projected/587378c1-b0c6-4d31-980f-0b6e8e271903-kube-api-access-t5xzm\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.319060 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.319071 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587378c1-b0c6-4d31-980f-0b6e8e271903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.319883 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.320398 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" 
(UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.321227 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.324003 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.326173 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.344905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfv6d\" (UniqueName: \"kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d\") pod \"dnsmasq-dns-69c986f6d7-sqm5t\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420322 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" 
Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420454 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420477 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420520 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420558 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420623 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.420668 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sdksd\" (UniqueName: \"kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.477656 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.495540 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522712 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522758 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522779 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522793 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 
11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522841 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522885 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdksd\" (UniqueName: \"kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.522910 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.523635 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.524155 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.527813 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data\") pod 
\"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.528249 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.528797 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.530992 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.545160 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdksd\" (UniqueName: \"kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd\") pod \"cinder-api-0\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.668711 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:53532->10.217.0.155:9311: read: connection reset by peer" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.668933 4700 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:53534->10.217.0.155:9311: read: connection reset by peer" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.688639 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.757124 4700 generic.go:334] "Generic (PLEG): container finished" podID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerID="9d4ef75aeb731bef542854e31e3e24f596e9d158208027f738c25f2d21ee234e" exitCode=0 Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.757215 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerDied","Data":"9d4ef75aeb731bef542854e31e3e24f596e9d158208027f738c25f2d21ee234e"} Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.759718 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z8zs9" event={"ID":"587378c1-b0c6-4d31-980f-0b6e8e271903","Type":"ContainerDied","Data":"d701a11c28a34cb1bb813d00b32fbdf8fe89c08461926908957ee1f67ba8f5d2"} Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.759745 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d701a11c28a34cb1bb813d00b32fbdf8fe89c08461926908957ee1f67ba8f5d2" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.759795 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z8zs9" Oct 07 11:38:11 crc kubenswrapper[4700]: I1007 11:38:11.996113 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.089910 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.141609 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.147721 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.151496 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256195 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256319 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76bz\" (UniqueName: \"kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256351 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256382 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256415 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.256446 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.318414 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.320354 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.329917 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2nz5w" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.330322 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.330431 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.330531 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359330 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76bz\" (UniqueName: \"kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359383 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359418 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359451 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359481 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.359504 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.360435 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.360470 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.361130 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.363856 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.365353 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.371993 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.389653 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.392655 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76bz\" (UniqueName: \"kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz\") pod \"dnsmasq-dns-5784cf869f-t6mnp\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.397845 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.463171 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.463237 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.463280 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8qbk\" (UniqueName: \"kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.463376 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.463437 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.546663 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.568317 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8qbk\" (UniqueName: \"kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.568416 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.568467 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.568558 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.568585 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc 
kubenswrapper[4700]: I1007 11:38:12.574615 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.575328 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.575763 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.585462 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.590328 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8qbk\" (UniqueName: \"kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk\") pod \"neutron-7554f5f8dd-vxdsk\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.640054 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.669965 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtln\" (UniqueName: \"kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln\") pod \"3b640b42-7f62-466a-b115-6d9a530e35a6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.670056 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data\") pod \"3b640b42-7f62-466a-b115-6d9a530e35a6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.670092 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs\") pod \"3b640b42-7f62-466a-b115-6d9a530e35a6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.670232 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle\") pod \"3b640b42-7f62-466a-b115-6d9a530e35a6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.670259 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom\") pod \"3b640b42-7f62-466a-b115-6d9a530e35a6\" (UID: \"3b640b42-7f62-466a-b115-6d9a530e35a6\") " Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.674595 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs" (OuterVolumeSpecName: "logs") pod "3b640b42-7f62-466a-b115-6d9a530e35a6" (UID: "3b640b42-7f62-466a-b115-6d9a530e35a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.674627 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b640b42-7f62-466a-b115-6d9a530e35a6" (UID: "3b640b42-7f62-466a-b115-6d9a530e35a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.679060 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln" (OuterVolumeSpecName: "kube-api-access-tgtln") pod "3b640b42-7f62-466a-b115-6d9a530e35a6" (UID: "3b640b42-7f62-466a-b115-6d9a530e35a6"). InnerVolumeSpecName "kube-api-access-tgtln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.686888 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.706231 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b640b42-7f62-466a-b115-6d9a530e35a6" (UID: "3b640b42-7f62-466a-b115-6d9a530e35a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.746217 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data" (OuterVolumeSpecName: "config-data") pod "3b640b42-7f62-466a-b115-6d9a530e35a6" (UID: "3b640b42-7f62-466a-b115-6d9a530e35a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.772195 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.774658 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgtln\" (UniqueName: \"kubernetes.io/projected/3b640b42-7f62-466a-b115-6d9a530e35a6-kube-api-access-tgtln\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.774672 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.774682 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b640b42-7f62-466a-b115-6d9a530e35a6-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.774691 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b640b42-7f62-466a-b115-6d9a530e35a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.786696 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerStarted","Data":"7041e0756ab1f63978de8f43735c700218c1308211f4e156149e81c949d3aa35"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.850201 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.850368 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85c8cc6bbb-jj2g6" event={"ID":"3b640b42-7f62-466a-b115-6d9a530e35a6","Type":"ContainerDied","Data":"a79b741caa427e79abebaea6f104a5385a2eb562a49992b8e0bb6dccb5dd4015"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.850629 4700 scope.go:117] "RemoveContainer" containerID="9d4ef75aeb731bef542854e31e3e24f596e9d158208027f738c25f2d21ee234e" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.852745 4700 generic.go:334] "Generic (PLEG): container finished" podID="3d15f870-78db-4dac-8cd3-c50ff365d666" containerID="3dc5a68934503045b373ca403fb6d2416b113dacec51fa04226f507e87d137e2" exitCode=0 Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.852797 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" event={"ID":"3d15f870-78db-4dac-8cd3-c50ff365d666","Type":"ContainerDied","Data":"3dc5a68934503045b373ca403fb6d2416b113dacec51fa04226f507e87d137e2"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.852818 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" event={"ID":"3d15f870-78db-4dac-8cd3-c50ff365d666","Type":"ContainerStarted","Data":"cc7f82c43719e93c7fb52d9403f04754f87a579d749a38b365e1a998cb9e00f0"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.904020 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.918815 4700 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerStarted","Data":"ebb70f04c36bc4ab7c8c63ce82ead44c3fb182ad3084e81bb4259c1c980990ee"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.919164 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.920945 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb","Type":"ContainerStarted","Data":"920ac60e7397d5880a85d3ff5f0f70d3ba94a0edd63aa7802d7b053c088e458b"} Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.928324 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-85c8cc6bbb-jj2g6"] Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.952994 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.228383172 podStartE2EDuration="6.952971975s" podCreationTimestamp="2025-10-07 11:38:06 +0000 UTC" firstStartedPulling="2025-10-07 11:38:07.603574398 +0000 UTC m=+1054.399973387" lastFinishedPulling="2025-10-07 11:38:12.328163201 +0000 UTC m=+1059.124562190" observedRunningTime="2025-10-07 11:38:12.942344498 +0000 UTC m=+1059.738743497" watchObservedRunningTime="2025-10-07 11:38:12.952971975 +0000 UTC m=+1059.749370964" Oct 07 11:38:12 crc kubenswrapper[4700]: I1007 11:38:12.955412 4700 scope.go:117] "RemoveContainer" containerID="99d8cdd0dc4c59f3a514986754227c73f83585b40756fceb170120a7f36b96ff" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.298513 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:13 crc kubenswrapper[4700]: W1007 11:38:13.309419 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08ef410f_221b_4062_a3a0_e20f2169c488.slice/crio-77ca4d0b0ddb89f5894ebf89de49125c5b6a6ac9c0c3a1b0f5adc1acc650055a WatchSource:0}: Error finding container 77ca4d0b0ddb89f5894ebf89de49125c5b6a6ac9c0c3a1b0f5adc1acc650055a: Status 404 returned error can't find the container with id 77ca4d0b0ddb89f5894ebf89de49125c5b6a6ac9c0c3a1b0f5adc1acc650055a Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.377911 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.509352 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.596449 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.596684 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.596795 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.597662 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfv6d\" (UniqueName: 
\"kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.597749 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.597818 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb\") pod \"3d15f870-78db-4dac-8cd3-c50ff365d666\" (UID: \"3d15f870-78db-4dac-8cd3-c50ff365d666\") " Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.605913 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d" (OuterVolumeSpecName: "kube-api-access-hfv6d") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "kube-api-access-hfv6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.630147 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config" (OuterVolumeSpecName: "config") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.642508 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.647269 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.659739 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.666533 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d15f870-78db-4dac-8cd3-c50ff365d666" (UID: "3d15f870-78db-4dac-8cd3-c50ff365d666"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700376 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700409 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700422 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfv6d\" (UniqueName: \"kubernetes.io/projected/3d15f870-78db-4dac-8cd3-c50ff365d666-kube-api-access-hfv6d\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700435 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700447 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.700459 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d15f870-78db-4dac-8cd3-c50ff365d666-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.835930 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.949652 4700 generic.go:334] "Generic (PLEG): container finished" podID="22a1da61-42b1-4a7a-bf08-511689c59a16" 
containerID="f0e131e76d6c7cf8906405d8fef1832141eb007d8aa0f5b261546e44fa897a55" exitCode=0 Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.949740 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" event={"ID":"22a1da61-42b1-4a7a-bf08-511689c59a16","Type":"ContainerDied","Data":"f0e131e76d6c7cf8906405d8fef1832141eb007d8aa0f5b261546e44fa897a55"} Oct 07 11:38:13 crc kubenswrapper[4700]: I1007 11:38:13.949768 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" event={"ID":"22a1da61-42b1-4a7a-bf08-511689c59a16","Type":"ContainerStarted","Data":"606fd13a7bb1600e518dbcf1ce71a030fcc5a0d4cba13cf10f28d15238599b6d"} Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.002701 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" path="/var/lib/kubelet/pods/3b640b42-7f62-466a-b115-6d9a530e35a6/volumes" Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.003608 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerStarted","Data":"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd"} Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.010497 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" event={"ID":"3d15f870-78db-4dac-8cd3-c50ff365d666","Type":"ContainerDied","Data":"cc7f82c43719e93c7fb52d9403f04754f87a579d749a38b365e1a998cb9e00f0"} Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.010546 4700 scope.go:117] "RemoveContainer" containerID="3dc5a68934503045b373ca403fb6d2416b113dacec51fa04226f507e87d137e2" Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.010643 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-sqm5t" Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.065610 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerStarted","Data":"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748"} Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.065851 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerStarted","Data":"77ca4d0b0ddb89f5894ebf89de49125c5b6a6ac9c0c3a1b0f5adc1acc650055a"} Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.400564 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:14 crc kubenswrapper[4700]: I1007 11:38:14.426142 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-sqm5t"] Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.089026 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" event={"ID":"22a1da61-42b1-4a7a-bf08-511689c59a16","Type":"ContainerStarted","Data":"e8e6f362d3fd8bda4f37e105243a3735d4f6b5ac13434e4af2fd4b7ef2d7c99c"} Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.089285 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.095678 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb","Type":"ContainerStarted","Data":"fad17572c6002f3df5339bd519e5c09534590a84db96bfc369b020f90e1dbe0e"} Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.097453 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerStarted","Data":"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9"} Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.097577 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api-log" containerID="cri-o://4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" gracePeriod=30 Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.097654 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.097681 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api" containerID="cri-o://564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" gracePeriod=30 Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.111058 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" podStartSLOduration=3.111037038 podStartE2EDuration="3.111037038s" podCreationTimestamp="2025-10-07 11:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:15.108334618 +0000 UTC m=+1061.904733607" watchObservedRunningTime="2025-10-07 11:38:15.111037038 +0000 UTC m=+1061.907436017" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.116073 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerStarted","Data":"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f"} Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.117274 4700 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.131328 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.131301846 podStartE2EDuration="4.131301846s" podCreationTimestamp="2025-10-07 11:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:15.125471424 +0000 UTC m=+1061.921870443" watchObservedRunningTime="2025-10-07 11:38:15.131301846 +0000 UTC m=+1061.927700835" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.163180 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7554f5f8dd-vxdsk" podStartSLOduration=3.163158746 podStartE2EDuration="3.163158746s" podCreationTimestamp="2025-10-07 11:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:15.154584692 +0000 UTC m=+1061.950983681" watchObservedRunningTime="2025-10-07 11:38:15.163158746 +0000 UTC m=+1061.959557735" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.756516 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.855196 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.855340 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.855408 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.855437 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdksd\" (UniqueName: \"kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.856021 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.856086 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.856139 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id\") pod \"792a56a7-50b6-45c4-9a3d-5c4104b86859\" (UID: \"792a56a7-50b6-45c4-9a3d-5c4104b86859\") " Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.856543 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.856559 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs" (OuterVolumeSpecName: "logs") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.865439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.865533 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd" (OuterVolumeSpecName: "kube-api-access-sdksd") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "kube-api-access-sdksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.867417 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts" (OuterVolumeSpecName: "scripts") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.886363 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.896671 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.899895 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d99cbbb56-xqb5x" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.933478 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data" (OuterVolumeSpecName: "config-data") pod "792a56a7-50b6-45c4-9a3d-5c4104b86859" (UID: "792a56a7-50b6-45c4-9a3d-5c4104b86859"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959816 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959862 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959876 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdksd\" (UniqueName: \"kubernetes.io/projected/792a56a7-50b6-45c4-9a3d-5c4104b86859-kube-api-access-sdksd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959890 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959900 4700 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792a56a7-50b6-45c4-9a3d-5c4104b86859-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959911 4700 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/792a56a7-50b6-45c4-9a3d-5c4104b86859-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.959921 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/792a56a7-50b6-45c4-9a3d-5c4104b86859-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:15 crc kubenswrapper[4700]: I1007 11:38:15.998753 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d15f870-78db-4dac-8cd3-c50ff365d666" path="/var/lib/kubelet/pods/3d15f870-78db-4dac-8cd3-c50ff365d666/volumes" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.143040 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb","Type":"ContainerStarted","Data":"b05ef4da362b57f929cdd908137a3dd1230c10fadbd478752cd7975245420543"} Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.155682 4700 generic.go:334] "Generic (PLEG): container finished" podID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerID="564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" exitCode=0 Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.155716 4700 generic.go:334] "Generic (PLEG): container finished" podID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerID="4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" exitCode=143 Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.156563 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.157004 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerDied","Data":"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9"} Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.157031 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerDied","Data":"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd"} Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.157040 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"792a56a7-50b6-45c4-9a3d-5c4104b86859","Type":"ContainerDied","Data":"7041e0756ab1f63978de8f43735c700218c1308211f4e156149e81c949d3aa35"} Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.157056 4700 scope.go:117] "RemoveContainer" containerID="564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.169081 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.617263277 podStartE2EDuration="5.169064707s" podCreationTimestamp="2025-10-07 11:38:11 +0000 UTC" firstStartedPulling="2025-10-07 11:38:12.373524693 +0000 UTC m=+1059.169923682" lastFinishedPulling="2025-10-07 11:38:13.925326123 +0000 UTC m=+1060.721725112" observedRunningTime="2025-10-07 11:38:16.166407068 +0000 UTC m=+1062.962806087" watchObservedRunningTime="2025-10-07 11:38:16.169064707 +0000 UTC m=+1062.965463696" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.208997 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.222253 4700 scope.go:117] "RemoveContainer" 
containerID="4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.236365 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254267 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.254618 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254635 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.254647 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254655 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.254676 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d15f870-78db-4dac-8cd3-c50ff365d666" containerName="init" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254682 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d15f870-78db-4dac-8cd3-c50ff365d666" containerName="init" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.254689 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254695 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.254714 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254721 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254891 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d15f870-78db-4dac-8cd3-c50ff365d666" containerName="init" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254904 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254915 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" containerName="cinder-api" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254924 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.254937 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b640b42-7f62-466a-b115-6d9a530e35a6" containerName="barbican-api-log" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.257549 4700 scope.go:117] "RemoveContainer" containerID="564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.260430 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.260549 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.268428 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9\": container with ID starting with 564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9 not found: ID does not exist" containerID="564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.268476 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9"} err="failed to get container status \"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9\": rpc error: code = NotFound desc = could not find container \"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9\": container with ID starting with 564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9 not found: ID does not exist" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.268501 4700 scope.go:117] "RemoveContainer" containerID="4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.268712 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.268817 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.268890 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 11:38:16 crc kubenswrapper[4700]: E1007 11:38:16.274013 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd\": container with ID starting with 4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd not found: ID does not exist" containerID="4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.274050 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd"} err="failed to get container status \"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd\": rpc error: code = NotFound desc = could not find container \"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd\": container with ID starting with 4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd not found: ID does not exist" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.274076 4700 scope.go:117] "RemoveContainer" containerID="564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.279418 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9"} err="failed to get container status \"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9\": rpc error: code = NotFound desc = could not find container \"564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9\": container with ID starting with 564578bbd8215cc7e66988ca703b330866165818bb56618919cf8cc894751ac9 not found: ID does not exist" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.279458 4700 scope.go:117] "RemoveContainer" containerID="4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.282182 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd"} err="failed to get container status \"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd\": rpc error: code = NotFound desc = could not find container \"4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd\": container with ID starting with 4b21ecba7a2cf9d305214f89de20890cbcf7636bdfbdf7df49e59eba49265bbd not found: ID does not exist" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366199 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-scripts\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366244 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366425 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xnpg\" (UniqueName: \"kubernetes.io/projected/9ab9a3d0-4f1c-4650-b766-836415e6cb40-kube-api-access-2xnpg\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366542 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc 
kubenswrapper[4700]: I1007 11:38:16.366562 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab9a3d0-4f1c-4650-b766-836415e6cb40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366577 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab9a3d0-4f1c-4650-b766-836415e6cb40-logs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366742 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366786 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.366856 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.468064 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9ab9a3d0-4f1c-4650-b766-836415e6cb40-logs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.468109 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.468130 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab9a3d0-4f1c-4650-b766-836415e6cb40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.468224 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab9a3d0-4f1c-4650-b766-836415e6cb40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.468732 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab9a3d0-4f1c-4650-b766-836415e6cb40-logs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.469286 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 
11:38:16.469348 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.469429 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.469450 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-scripts\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.469475 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.469561 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xnpg\" (UniqueName: \"kubernetes.io/projected/9ab9a3d0-4f1c-4650-b766-836415e6cb40-kube-api-access-2xnpg\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.472847 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.474715 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.477710 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.477783 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.481639 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-scripts\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.484189 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.484700 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab9a3d0-4f1c-4650-b766-836415e6cb40-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" 
Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.496603 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8444f487fd-js794" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.500988 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xnpg\" (UniqueName: \"kubernetes.io/projected/9ab9a3d0-4f1c-4650-b766-836415e6cb40-kube-api-access-2xnpg\") pod \"cinder-api-0\" (UID: \"9ab9a3d0-4f1c-4650-b766-836415e6cb40\") " pod="openstack/cinder-api-0" Oct 07 11:38:16 crc kubenswrapper[4700]: I1007 11:38:16.595393 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.111386 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 11:38:17 crc kubenswrapper[4700]: W1007 11:38:17.123603 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab9a3d0_4f1c_4650_b766_836415e6cb40.slice/crio-6bb92b6e459b7b5fe73ad49356fe19c32101e426905c02411a32bc6f6e146057 WatchSource:0}: Error finding container 6bb92b6e459b7b5fe73ad49356fe19c32101e426905c02411a32bc6f6e146057: Status 404 returned error can't find the container with id 6bb92b6e459b7b5fe73ad49356fe19c32101e426905c02411a32bc6f6e146057 Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.181927 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab9a3d0-4f1c-4650-b766-836415e6cb40","Type":"ContainerStarted","Data":"6bb92b6e459b7b5fe73ad49356fe19c32101e426905c02411a32bc6f6e146057"} Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.490659 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-744b8f5559-c67wh"] Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.492071 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.494449 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.494665 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.526234 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-744b8f5559-c67wh"] Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.624931 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-combined-ca-bundle\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625104 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-httpd-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625221 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972jk\" (UniqueName: \"kubernetes.io/projected/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-kube-api-access-972jk\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625374 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625611 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-public-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625669 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-internal-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.625689 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-ovndb-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727431 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727745 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-public-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727769 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-internal-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727783 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-ovndb-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727821 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-combined-ca-bundle\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727876 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-httpd-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.727897 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972jk\" (UniqueName: 
\"kubernetes.io/projected/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-kube-api-access-972jk\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.734135 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-combined-ca-bundle\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.735210 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-internal-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.735411 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.736219 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-ovndb-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.737879 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-public-tls-certs\") pod \"neutron-744b8f5559-c67wh\" (UID: 
\"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.751336 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972jk\" (UniqueName: \"kubernetes.io/projected/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-kube-api-access-972jk\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.752240 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b103be5-6b3d-41f7-ba2e-34f1f5b2730a-httpd-config\") pod \"neutron-744b8f5559-c67wh\" (UID: \"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a\") " pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.810394 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:17 crc kubenswrapper[4700]: I1007 11:38:17.977066 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792a56a7-50b6-45c4-9a3d-5c4104b86859" path="/var/lib/kubelet/pods/792a56a7-50b6-45c4-9a3d-5c4104b86859/volumes" Oct 07 11:38:18 crc kubenswrapper[4700]: I1007 11:38:18.224537 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab9a3d0-4f1c-4650-b766-836415e6cb40","Type":"ContainerStarted","Data":"b072e4f86eee19bfa03ec758eac34a52b66d93d141ab812222be5472e678858e"} Oct 07 11:38:18 crc kubenswrapper[4700]: I1007 11:38:18.395713 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-744b8f5559-c67wh"] Oct 07 11:38:18 crc kubenswrapper[4700]: W1007 11:38:18.430513 4700 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b103be5_6b3d_41f7_ba2e_34f1f5b2730a.slice/crio-da4425bada0ef2d5299c1a007f7f0d52c9b627f0fb5cd9b0600dbae96aa6d5da WatchSource:0}: Error finding container da4425bada0ef2d5299c1a007f7f0d52c9b627f0fb5cd9b0600dbae96aa6d5da: Status 404 returned error can't find the container with id da4425bada0ef2d5299c1a007f7f0d52c9b627f0fb5cd9b0600dbae96aa6d5da Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.233377 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab9a3d0-4f1c-4650-b766-836415e6cb40","Type":"ContainerStarted","Data":"74eb14031bdda67651b1eb52e8fd8d793463e24081d5cdf5ffdf430e1d9c93bf"} Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.233700 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.235648 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744b8f5559-c67wh" event={"ID":"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a","Type":"ContainerStarted","Data":"81bab9b3d116fa46f12a5d76adf29f766b59a18b84cb8e0d485d79217061137c"} Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.235675 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744b8f5559-c67wh" event={"ID":"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a","Type":"ContainerStarted","Data":"fef23cc3a6e10176fe03d3ab7d9e310a6f69f68f409ab0ba35f90b26e7cfcb99"} Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.235685 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744b8f5559-c67wh" event={"ID":"3b103be5-6b3d-41f7-ba2e-34f1f5b2730a","Type":"ContainerStarted","Data":"da4425bada0ef2d5299c1a007f7f0d52c9b627f0fb5cd9b0600dbae96aa6d5da"} Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.236416 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:19 crc 
kubenswrapper[4700]: I1007 11:38:19.254840 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.254823173 podStartE2EDuration="3.254823173s" podCreationTimestamp="2025-10-07 11:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:19.252572054 +0000 UTC m=+1066.048971043" watchObservedRunningTime="2025-10-07 11:38:19.254823173 +0000 UTC m=+1066.051222162" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.288828 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-744b8f5559-c67wh" podStartSLOduration=2.288808748 podStartE2EDuration="2.288808748s" podCreationTimestamp="2025-10-07 11:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:19.282768381 +0000 UTC m=+1066.079167370" watchObservedRunningTime="2025-10-07 11:38:19.288808748 +0000 UTC m=+1066.085207737" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.617769 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.619184 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.621563 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.621707 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.621835 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ftn88" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.629843 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.773229 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.773295 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnfsq\" (UniqueName: \"kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.773363 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.773390 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.874548 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnfsq\" (UniqueName: \"kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.874650 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.874694 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.874798 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.876388 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.881266 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.881335 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.892956 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnfsq\" (UniqueName: \"kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq\") pod \"openstackclient\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " pod="openstack/openstackclient" Oct 07 11:38:19 crc kubenswrapper[4700]: I1007 11:38:19.944244 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.006066 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.026122 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.041795 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.043561 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.050239 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:20 crc kubenswrapper[4700]: E1007 11:38:20.128969 4700 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 07 11:38:20 crc kubenswrapper[4700]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_56804898-9b59-4cc9-bb99-f01704c11a0f_0(c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898" Netns:"/var/run/netns/a29cb392-3a59-437f-81e7-210a2fa48525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898;K8S_POD_UID=56804898-9b59-4cc9-bb99-f01704c11a0f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/56804898-9b59-4cc9-bb99-f01704c11a0f]: expected pod UID "56804898-9b59-4cc9-bb99-f01704c11a0f" but got 
"1cb5863c-c473-49b8-9ffa-ce83d51a061c" from Kube API Oct 07 11:38:20 crc kubenswrapper[4700]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 07 11:38:20 crc kubenswrapper[4700]: > Oct 07 11:38:20 crc kubenswrapper[4700]: E1007 11:38:20.129049 4700 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 07 11:38:20 crc kubenswrapper[4700]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_56804898-9b59-4cc9-bb99-f01704c11a0f_0(c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898" Netns:"/var/run/netns/a29cb392-3a59-437f-81e7-210a2fa48525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c4d782921473708e42133d9dbcaea7d45f9eccba765c03159ef036b9f10cf898;K8S_POD_UID=56804898-9b59-4cc9-bb99-f01704c11a0f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/56804898-9b59-4cc9-bb99-f01704c11a0f]: expected pod UID "56804898-9b59-4cc9-bb99-f01704c11a0f" but got "1cb5863c-c473-49b8-9ffa-ce83d51a061c" from Kube API Oct 07 11:38:20 crc kubenswrapper[4700]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 07 11:38:20 crc kubenswrapper[4700]: > pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.181624 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.182027 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wd6k\" (UniqueName: \"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.182085 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.182122 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " 
pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.243275 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.246512 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="56804898-9b59-4cc9-bb99-f01704c11a0f" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.252822 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.283949 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.284063 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.284119 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wd6k\" (UniqueName: \"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.284157 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.284949 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.288092 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.300949 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.307331 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wd6k\" (UniqueName: \"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k\") pod \"openstackclient\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.376645 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.385645 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config\") pod \"56804898-9b59-4cc9-bb99-f01704c11a0f\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.385736 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle\") pod \"56804898-9b59-4cc9-bb99-f01704c11a0f\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.385789 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret\") pod \"56804898-9b59-4cc9-bb99-f01704c11a0f\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.386050 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnfsq\" (UniqueName: \"kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq\") pod \"56804898-9b59-4cc9-bb99-f01704c11a0f\" (UID: \"56804898-9b59-4cc9-bb99-f01704c11a0f\") " Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.390557 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "56804898-9b59-4cc9-bb99-f01704c11a0f" (UID: "56804898-9b59-4cc9-bb99-f01704c11a0f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.392594 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "56804898-9b59-4cc9-bb99-f01704c11a0f" (UID: "56804898-9b59-4cc9-bb99-f01704c11a0f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.393360 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56804898-9b59-4cc9-bb99-f01704c11a0f" (UID: "56804898-9b59-4cc9-bb99-f01704c11a0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.394632 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq" (OuterVolumeSpecName: "kube-api-access-mnfsq") pod "56804898-9b59-4cc9-bb99-f01704c11a0f" (UID: "56804898-9b59-4cc9-bb99-f01704c11a0f"). InnerVolumeSpecName "kube-api-access-mnfsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.489198 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnfsq\" (UniqueName: \"kubernetes.io/projected/56804898-9b59-4cc9-bb99-f01704c11a0f-kube-api-access-mnfsq\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.489255 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.489268 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.489281 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56804898-9b59-4cc9-bb99-f01704c11a0f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:20 crc kubenswrapper[4700]: I1007 11:38:20.854947 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 11:38:20 crc kubenswrapper[4700]: W1007 11:38:20.857176 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb5863c_c473_49b8_9ffa_ce83d51a061c.slice/crio-a360bd6c0374704173a13cae89a057db2be5b08f804b296670b2fa1f48e622bf WatchSource:0}: Error finding container a360bd6c0374704173a13cae89a057db2be5b08f804b296670b2fa1f48e622bf: Status 404 returned error can't find the container with id a360bd6c0374704173a13cae89a057db2be5b08f804b296670b2fa1f48e622bf Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.254553 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"1cb5863c-c473-49b8-9ffa-ce83d51a061c","Type":"ContainerStarted","Data":"a360bd6c0374704173a13cae89a057db2be5b08f804b296670b2fa1f48e622bf"} Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.254606 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.257511 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="56804898-9b59-4cc9-bb99-f01704c11a0f" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.682467 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.751856 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:21 crc kubenswrapper[4700]: I1007 11:38:21.980671 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56804898-9b59-4cc9-bb99-f01704c11a0f" path="/var/lib/kubelet/pods/56804898-9b59-4cc9-bb99-f01704c11a0f/volumes" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.017837 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.019216 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.025913 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.026022 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bxh9j" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.026149 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.033140 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.104250 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.105597 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.107495 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.114993 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.124674 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.124898 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="dnsmasq-dns" containerID="cri-o://e8e6f362d3fd8bda4f37e105243a3735d4f6b5ac13434e4af2fd4b7ef2d7c99c" gracePeriod=10 Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.128340 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.128422 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5cd\" (UniqueName: \"kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.128457 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.128481 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.128611 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.144485 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.145995 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.156791 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.178370 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.179890 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.189022 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.212777 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.230583 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.230626 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.230658 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.230693 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc 
kubenswrapper[4700]: I1007 11:38:22.230959 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.231027 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmwb2\" (UniqueName: \"kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.231150 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.231331 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5cd\" (UniqueName: \"kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.238264 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 
crc kubenswrapper[4700]: I1007 11:38:22.270161 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.271383 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.278209 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5cd\" (UniqueName: \"kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd\") pod \"heat-engine-759f9b554d-c5s6x\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.332903 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.332968 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 
11:38:22.332995 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333028 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333051 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmwb2\" (UniqueName: \"kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333070 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333089 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc 
kubenswrapper[4700]: I1007 11:38:22.333123 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlf7\" (UniqueName: \"kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333176 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333201 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333229 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333243 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc 
kubenswrapper[4700]: I1007 11:38:22.333258 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.333275 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5w4\" (UniqueName: \"kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.340672 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.341836 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.363080 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.363956 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmwb2\" (UniqueName: \"kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.369718 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data\") pod \"heat-cfnapi-79dfccd884-2qrjv\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.403885 4700 generic.go:334] "Generic (PLEG): container finished" podID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerID="e8e6f362d3fd8bda4f37e105243a3735d4f6b5ac13434e4af2fd4b7ef2d7c99c" exitCode=0 Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.404120 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="cinder-scheduler" containerID="cri-o://fad17572c6002f3df5339bd519e5c09534590a84db96bfc369b020f90e1dbe0e" gracePeriod=30 Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.404215 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" event={"ID":"22a1da61-42b1-4a7a-bf08-511689c59a16","Type":"ContainerDied","Data":"e8e6f362d3fd8bda4f37e105243a3735d4f6b5ac13434e4af2fd4b7ef2d7c99c"} Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.404335 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="probe" 
containerID="cri-o://b05ef4da362b57f929cdd908137a3dd1230c10fadbd478752cd7975245420543" gracePeriod=30 Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.429432 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.437936 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.437991 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438023 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438041 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438067 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5w4\" (UniqueName: 
\"kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438117 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438141 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438178 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438196 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.438227 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlf7\" (UniqueName: 
\"kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.440648 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.440973 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.452982 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.453737 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.454812 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" 
(UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.463398 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.464002 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.467223 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.469070 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5w4\" (UniqueName: \"kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4\") pod \"heat-api-89899c7cb-nsfvj\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.474344 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlf7\" (UniqueName: \"kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7\") pod \"dnsmasq-dns-f6bc4c6c9-4mfq6\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 
07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.535416 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.735955 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.751580 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.853113 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.853160 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.853177 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.853214 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v76bz\" (UniqueName: \"kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc 
kubenswrapper[4700]: I1007 11:38:22.853247 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.853290 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb\") pod \"22a1da61-42b1-4a7a-bf08-511689c59a16\" (UID: \"22a1da61-42b1-4a7a-bf08-511689c59a16\") " Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.876469 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz" (OuterVolumeSpecName: "kube-api-access-v76bz") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "kube-api-access-v76bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.925934 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config" (OuterVolumeSpecName: "config") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.938118 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.938726 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.947949 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.954818 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.954844 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.954854 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.954862 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v76bz\" (UniqueName: \"kubernetes.io/projected/22a1da61-42b1-4a7a-bf08-511689c59a16-kube-api-access-v76bz\") on node \"crc\" DevicePath 
\"\"" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.954873 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:22 crc kubenswrapper[4700]: I1007 11:38:22.980324 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22a1da61-42b1-4a7a-bf08-511689c59a16" (UID: "22a1da61-42b1-4a7a-bf08-511689c59a16"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.021952 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.059405 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a1da61-42b1-4a7a-bf08-511689c59a16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.162748 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.262957 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:23 crc kubenswrapper[4700]: W1007 11:38:23.284381 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d17d5c_b3c0_46bd_971e_194cda30f554.slice/crio-ce32ffef18cd6864708325d88ef63e3429c6d649aab2cc443c05d2c03112d8a0 WatchSource:0}: Error finding container ce32ffef18cd6864708325d88ef63e3429c6d649aab2cc443c05d2c03112d8a0: Status 404 returned error can't find the container with id ce32ffef18cd6864708325d88ef63e3429c6d649aab2cc443c05d2c03112d8a0 
Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.374910 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.420629 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" event={"ID":"0366de18-d4be-4a04-8bc7-b6343f5fc3f8","Type":"ContainerStarted","Data":"5445846a9d2be1fa12fe5f1e511331b5de4a91e53ed584762fcef7fe34c15919"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.422386 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-89899c7cb-nsfvj" event={"ID":"37d17d5c-b3c0-46bd-971e-194cda30f554","Type":"ContainerStarted","Data":"ce32ffef18cd6864708325d88ef63e3429c6d649aab2cc443c05d2c03112d8a0"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.423420 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" event={"ID":"a816d013-1668-445f-a7ab-0d25ad465c16","Type":"ContainerStarted","Data":"eb6f8f74ae6c890a9c3d25c23bbea6b1300091256a40e2f694dd392fbdb28e1d"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.425374 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" event={"ID":"22a1da61-42b1-4a7a-bf08-511689c59a16","Type":"ContainerDied","Data":"606fd13a7bb1600e518dbcf1ce71a030fcc5a0d4cba13cf10f28d15238599b6d"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.425405 4700 scope.go:117] "RemoveContainer" containerID="e8e6f362d3fd8bda4f37e105243a3735d4f6b5ac13434e4af2fd4b7ef2d7c99c" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.425547 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.437141 4700 generic.go:334] "Generic (PLEG): container finished" podID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerID="b05ef4da362b57f929cdd908137a3dd1230c10fadbd478752cd7975245420543" exitCode=0 Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.437189 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb","Type":"ContainerDied","Data":"b05ef4da362b57f929cdd908137a3dd1230c10fadbd478752cd7975245420543"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.439819 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-759f9b554d-c5s6x" event={"ID":"b7389d16-ec46-47d3-9466-2b844946b6c6","Type":"ContainerStarted","Data":"370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.439843 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-759f9b554d-c5s6x" event={"ID":"b7389d16-ec46-47d3-9466-2b844946b6c6","Type":"ContainerStarted","Data":"5091ad7e301d50258a7b61dce2237a37cdff4992ba75a4d5370ddf3d6f90764e"} Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.440700 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.475796 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-759f9b554d-c5s6x" podStartSLOduration=2.475777827 podStartE2EDuration="2.475777827s" podCreationTimestamp="2025-10-07 11:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:23.463679352 +0000 UTC m=+1070.260078341" watchObservedRunningTime="2025-10-07 11:38:23.475777827 +0000 UTC 
m=+1070.272176816" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.546404 4700 scope.go:117] "RemoveContainer" containerID="f0e131e76d6c7cf8906405d8fef1832141eb007d8aa0f5b261546e44fa897a55" Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.558015 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.571262 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-t6mnp"] Oct 07 11:38:23 crc kubenswrapper[4700]: I1007 11:38:23.977660 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" path="/var/lib/kubelet/pods/22a1da61-42b1-4a7a-bf08-511689c59a16/volumes" Oct 07 11:38:24 crc kubenswrapper[4700]: I1007 11:38:24.466024 4700 generic.go:334] "Generic (PLEG): container finished" podID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerID="aab20955711c5d551669a5cabacec100b65721767ecfd298073c7c73c3d02a6d" exitCode=0 Oct 07 11:38:24 crc kubenswrapper[4700]: I1007 11:38:24.467556 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" event={"ID":"0366de18-d4be-4a04-8bc7-b6343f5fc3f8","Type":"ContainerDied","Data":"aab20955711c5d551669a5cabacec100b65721767ecfd298073c7c73c3d02a6d"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.039862 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.040519 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-central-agent" containerID="cri-o://e6d56bce692368beab37f3b7e247428068e180485ce31f6d8762a56c4b56248c" gracePeriod=30 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.042063 4700 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="sg-core" containerID="cri-o://93173686d028ce24f9157f7fdf9d1711876ac324a6aadc898e0a3f2526f1e730" gracePeriod=30 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.042092 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="proxy-httpd" containerID="cri-o://ebb70f04c36bc4ab7c8c63ce82ead44c3fb182ad3084e81bb4259c1c980990ee" gracePeriod=30 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.042193 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-notification-agent" containerID="cri-o://2739cdef6b7e2e259fb240f04cddb29fea6bbf362e7d5f1135d21f8fa61bb21b" gracePeriod=30 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.058564 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": EOF" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.495761 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" event={"ID":"0366de18-d4be-4a04-8bc7-b6343f5fc3f8","Type":"ContainerStarted","Data":"d353466d27b429c9a343e53308a8c7a0c67a9f1555fa1792fcbb02b44add53bb"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.496066 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.498245 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-89899c7cb-nsfvj" event={"ID":"37d17d5c-b3c0-46bd-971e-194cda30f554","Type":"ContainerStarted","Data":"84a782702159137ec346ea0622df1f9ecc1fbc33c47da26a05c7e0ab48c1c714"} 
Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.498877 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.514758 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" podStartSLOduration=5.514738671 podStartE2EDuration="5.514738671s" podCreationTimestamp="2025-10-07 11:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:27.511432845 +0000 UTC m=+1074.307831854" watchObservedRunningTime="2025-10-07 11:38:27.514738671 +0000 UTC m=+1074.311137660" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520105 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520904 4700 generic.go:334] "Generic (PLEG): container finished" podID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerID="ebb70f04c36bc4ab7c8c63ce82ead44c3fb182ad3084e81bb4259c1c980990ee" exitCode=0 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520928 4700 generic.go:334] "Generic (PLEG): container finished" podID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerID="93173686d028ce24f9157f7fdf9d1711876ac324a6aadc898e0a3f2526f1e730" exitCode=2 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520937 4700 generic.go:334] "Generic (PLEG): container finished" podID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerID="e6d56bce692368beab37f3b7e247428068e180485ce31f6d8762a56c4b56248c" exitCode=0 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520974 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerDied","Data":"ebb70f04c36bc4ab7c8c63ce82ead44c3fb182ad3084e81bb4259c1c980990ee"} Oct 07 
11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.520997 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerDied","Data":"93173686d028ce24f9157f7fdf9d1711876ac324a6aadc898e0a3f2526f1e730"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.521007 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerDied","Data":"e6d56bce692368beab37f3b7e247428068e180485ce31f6d8762a56c4b56248c"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.523155 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" event={"ID":"a816d013-1668-445f-a7ab-0d25ad465c16","Type":"ContainerStarted","Data":"2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.523927 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.533351 4700 generic.go:334] "Generic (PLEG): container finished" podID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerID="fad17572c6002f3df5339bd519e5c09534590a84db96bfc369b020f90e1dbe0e" exitCode=0 Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.533594 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb","Type":"ContainerDied","Data":"fad17572c6002f3df5339bd519e5c09534590a84db96bfc369b020f90e1dbe0e"} Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.533706 4700 scope.go:117] "RemoveContainer" containerID="b05ef4da362b57f929cdd908137a3dd1230c10fadbd478752cd7975245420543" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.533962 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.537259 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-89899c7cb-nsfvj" podStartSLOduration=1.914993517 podStartE2EDuration="5.537243067s" podCreationTimestamp="2025-10-07 11:38:22 +0000 UTC" firstStartedPulling="2025-10-07 11:38:23.28928493 +0000 UTC m=+1070.085683919" lastFinishedPulling="2025-10-07 11:38:26.91153448 +0000 UTC m=+1073.707933469" observedRunningTime="2025-10-07 11:38:27.525903822 +0000 UTC m=+1074.322302821" watchObservedRunningTime="2025-10-07 11:38:27.537243067 +0000 UTC m=+1074.333642056" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.541338 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" podStartSLOduration=1.809916291 podStartE2EDuration="5.541324014s" podCreationTimestamp="2025-10-07 11:38:22 +0000 UTC" firstStartedPulling="2025-10-07 11:38:23.175603019 +0000 UTC m=+1069.972002008" lastFinishedPulling="2025-10-07 11:38:26.907010722 +0000 UTC m=+1073.703409731" observedRunningTime="2025-10-07 11:38:27.540470851 +0000 UTC m=+1074.336869850" watchObservedRunningTime="2025-10-07 11:38:27.541324014 +0000 UTC m=+1074.337723003" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553227 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4h98\" (UniqueName: \"kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553335 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: 
\"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553402 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553681 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553741 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.553805 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data\") pod \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\" (UID: \"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb\") " Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.556034 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.558586 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts" (OuterVolumeSpecName: "scripts") pod "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.561158 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.565432 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98" (OuterVolumeSpecName: "kube-api-access-d4h98") pod "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "kube-api-access-d4h98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.607162 4700 scope.go:117] "RemoveContainer" containerID="fad17572c6002f3df5339bd519e5c09534590a84db96bfc369b020f90e1dbe0e" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.633955 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.643541 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-t6mnp" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: i/o timeout" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.656250 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4h98\" (UniqueName: \"kubernetes.io/projected/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-kube-api-access-d4h98\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.656280 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.656289 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.656299 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.656317 4700 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.716053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data" (OuterVolumeSpecName: "config-data") pod 
"9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" (UID: "9fa37e8f-6dfc-44f0-8fab-1dfa641475cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.758084 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.877188 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.887636 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.898363 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7cd7d44d75-xs58b"] Oct 07 11:38:27 crc kubenswrapper[4700]: E1007 11:38:27.898864 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="init" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.898900 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="init" Oct 07 11:38:27 crc kubenswrapper[4700]: E1007 11:38:27.898927 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="dnsmasq-dns" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.898934 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="dnsmasq-dns" Oct 07 11:38:27 crc kubenswrapper[4700]: E1007 11:38:27.898951 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="cinder-scheduler" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.898960 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="cinder-scheduler" Oct 07 11:38:27 crc kubenswrapper[4700]: E1007 11:38:27.898975 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="probe" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.898983 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="probe" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.899192 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="probe" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.899222 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" containerName="cinder-scheduler" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.899241 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a1da61-42b1-4a7a-bf08-511689c59a16" containerName="dnsmasq-dns" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.900493 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.905371 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.905768 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.905818 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.912504 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.914008 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.919706 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962429 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-internal-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962491 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-combined-ca-bundle\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962523 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-config-data\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962598 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-run-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962770 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962834 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-log-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962868 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-public-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962924 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962947 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvpv\" (UniqueName: \"kubernetes.io/projected/36ec56ec-a014-4027-a6c0-c817f5bda5ca-kube-api-access-fqvpv\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.962979 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36ec56ec-a014-4027-a6c0-c817f5bda5ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.963047 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.963092 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.963115 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-etc-swift\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:27 crc kubenswrapper[4700]: I1007 11:38:27.963143 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmthq\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-kube-api-access-dmthq\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.011755 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa37e8f-6dfc-44f0-8fab-1dfa641475cb" path="/var/lib/kubelet/pods/9fa37e8f-6dfc-44f0-8fab-1dfa641475cb/volumes" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.012479 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cd7d44d75-xs58b"] Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.012506 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.070837 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-run-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.070972 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " 
pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071014 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-log-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071048 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-public-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071090 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071115 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvpv\" (UniqueName: \"kubernetes.io/projected/36ec56ec-a014-4027-a6c0-c817f5bda5ca-kube-api-access-fqvpv\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071146 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36ec56ec-a014-4027-a6c0-c817f5bda5ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071197 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071233 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071258 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-etc-swift\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071291 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmthq\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-kube-api-access-dmthq\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071369 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-internal-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071402 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-combined-ca-bundle\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.071432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-config-data\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.072747 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36ec56ec-a014-4027-a6c0-c817f5bda5ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.073662 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-log-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.073992 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-run-httpd\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.079899 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-config-data\") pod 
\"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.085134 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-etc-swift\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.089621 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.102192 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-public-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.107067 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.114287 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvpv\" (UniqueName: \"kubernetes.io/projected/36ec56ec-a014-4027-a6c0-c817f5bda5ca-kube-api-access-fqvpv\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc 
kubenswrapper[4700]: I1007 11:38:28.116024 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.116582 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ec56ec-a014-4027-a6c0-c817f5bda5ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"36ec56ec-a014-4027-a6c0-c817f5bda5ca\") " pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.117253 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-combined-ca-bundle\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.117335 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmthq\" (UniqueName: \"kubernetes.io/projected/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-kube-api-access-dmthq\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.117429 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81-internal-tls-certs\") pod \"swift-proxy-7cd7d44d75-xs58b\" (UID: \"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81\") " pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.332601 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.346753 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 11:38:28 crc kubenswrapper[4700]: I1007 11:38:28.917167 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.602360 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.603600 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.615731 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-575565f88c-czn8g"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.617004 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.630370 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.631886 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.639475 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.649238 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-575565f88c-czn8g"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.661748 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802054 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802100 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dksf\" (UniqueName: \"kubernetes.io/projected/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-kube-api-access-6dksf\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802121 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802169 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802193 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802241 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsd2\" (UniqueName: \"kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802266 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-combined-ca-bundle\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802325 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802340 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2td8r\" (UniqueName: \"kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802368 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802398 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.802428 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data-custom\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911141 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911231 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911405 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsd2\" (UniqueName: \"kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911456 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-combined-ca-bundle\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911542 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2td8r\" (UniqueName: \"kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911617 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911679 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911735 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data-custom\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911770 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911803 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dksf\" (UniqueName: \"kubernetes.io/projected/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-kube-api-access-6dksf\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.911835 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.917399 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.918239 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.924448 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.935125 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.935260 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data\") pod 
\"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.937624 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.937625 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data-custom\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.938026 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-combined-ca-bundle\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.939104 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-config-data\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.941865 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2td8r\" (UniqueName: \"kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r\") pod \"heat-cfnapi-dbfd865bb-8242b\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " 
pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.942044 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsd2\" (UniqueName: \"kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2\") pod \"heat-api-7f979f4f58-7tjtf\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.942118 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dksf\" (UniqueName: \"kubernetes.io/projected/83eadcce-bdaa-493b-b76c-91cdfd9f8b15-kube-api-access-6dksf\") pod \"heat-engine-575565f88c-czn8g\" (UID: \"83eadcce-bdaa-493b-b76c-91cdfd9f8b15\") " pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:29 crc kubenswrapper[4700]: I1007 11:38:29.966819 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:30 crc kubenswrapper[4700]: I1007 11:38:30.225839 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:30 crc kubenswrapper[4700]: I1007 11:38:30.240485 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.074323 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.074567 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-89899c7cb-nsfvj" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerName="heat-api" containerID="cri-o://84a782702159137ec346ea0622df1f9ecc1fbc33c47da26a05c7e0ab48c1c714" gracePeriod=60 Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.085780 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.085995 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" containerName="heat-cfnapi" containerID="cri-o://2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0" gracePeriod=60 Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.120566 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-764bc4c4ff-fb769"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.121816 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.125425 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.125445 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.133071 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5bd5586b7f-mt9tb"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.134624 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.139069 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.139751 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178590 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-internal-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178680 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-combined-ca-bundle\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178723 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-public-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178755 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-public-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178799 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-combined-ca-bundle\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178845 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178877 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data-custom\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc 
kubenswrapper[4700]: I1007 11:38:31.178921 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.178996 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68n2d\" (UniqueName: \"kubernetes.io/projected/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-kube-api-access-68n2d\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.179080 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfgv\" (UniqueName: \"kubernetes.io/projected/5a792755-beef-4d08-a80d-8fd891e9027a-kube-api-access-4sfgv\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.179114 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data-custom\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.179156 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-internal-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " 
pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.189175 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bd5586b7f-mt9tb"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.241410 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-764bc4c4ff-fb769"] Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280289 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-public-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280660 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-public-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280689 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-combined-ca-bundle\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280727 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280756 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data-custom\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280786 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280834 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68n2d\" (UniqueName: \"kubernetes.io/projected/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-kube-api-access-68n2d\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.280887 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfgv\" (UniqueName: \"kubernetes.io/projected/5a792755-beef-4d08-a80d-8fd891e9027a-kube-api-access-4sfgv\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.282154 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data-custom\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.282244 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-internal-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.282406 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-internal-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.282486 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-combined-ca-bundle\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.291025 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-combined-ca-bundle\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.291289 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-internal-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.291918 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-public-tls-certs\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.293265 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-combined-ca-bundle\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.294860 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data-custom\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.296494 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data-custom\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.298394 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-public-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.298469 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-config-data\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.298503 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a792755-beef-4d08-a80d-8fd891e9027a-config-data\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.300087 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfgv\" (UniqueName: \"kubernetes.io/projected/5a792755-beef-4d08-a80d-8fd891e9027a-kube-api-access-4sfgv\") pod \"heat-cfnapi-764bc4c4ff-fb769\" (UID: \"5a792755-beef-4d08-a80d-8fd891e9027a\") " pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.300279 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-internal-tls-certs\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.303506 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68n2d\" (UniqueName: \"kubernetes.io/projected/6d6d6a4d-b338-4b5f-8606-ecb9129b2a15-kube-api-access-68n2d\") pod \"heat-api-5bd5586b7f-mt9tb\" (UID: \"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15\") " pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.459958 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.473883 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:31 crc kubenswrapper[4700]: E1007 11:38:31.552426 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda816d013_1668_445f_a7ab_0d25ad465c16.slice/crio-conmon-2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0.scope\": RecentStats: unable to find data in memory cache]" Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.581733 4700 generic.go:334] "Generic (PLEG): container finished" podID="a816d013-1668-445f-a7ab-0d25ad465c16" containerID="2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0" exitCode=0 Oct 07 11:38:31 crc kubenswrapper[4700]: I1007 11:38:31.581787 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" event={"ID":"a816d013-1668-445f-a7ab-0d25ad465c16","Type":"ContainerDied","Data":"2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0"} Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.433782 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.170:8000/healthcheck\": dial tcp 10.217.0.170:8000: connect: connection refused" Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.538638 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-89899c7cb-nsfvj" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.171:8004/healthcheck\": dial tcp 10.217.0.171:8004: connect: connection refused" Oct 07 11:38:32 crc 
kubenswrapper[4700]: I1007 11:38:32.593454 4700 generic.go:334] "Generic (PLEG): container finished" podID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerID="84a782702159137ec346ea0622df1f9ecc1fbc33c47da26a05c7e0ab48c1c714" exitCode=0 Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.593523 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-89899c7cb-nsfvj" event={"ID":"37d17d5c-b3c0-46bd-971e-194cda30f554","Type":"ContainerDied","Data":"84a782702159137ec346ea0622df1f9ecc1fbc33c47da26a05c7e0ab48c1c714"} Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.596252 4700 generic.go:334] "Generic (PLEG): container finished" podID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerID="2739cdef6b7e2e259fb240f04cddb29fea6bbf362e7d5f1135d21f8fa61bb21b" exitCode=0 Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.596282 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerDied","Data":"2739cdef6b7e2e259fb240f04cddb29fea6bbf362e7d5f1135d21f8fa61bb21b"} Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.753928 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.805668 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:38:32 crc kubenswrapper[4700]: I1007 11:38:32.805897 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="dnsmasq-dns" containerID="cri-o://ac3e04171ef90a66417460183beaf2d07c8259a7eb36db92868dc5bc4b3aab4d" gracePeriod=10 Oct 07 11:38:33 crc kubenswrapper[4700]: I1007 11:38:33.614209 4700 generic.go:334] "Generic (PLEG): container finished" podID="5fce846e-41b6-4b09-a5db-236763b8e5f9" 
containerID="ac3e04171ef90a66417460183beaf2d07c8259a7eb36db92868dc5bc4b3aab4d" exitCode=0 Oct 07 11:38:33 crc kubenswrapper[4700]: I1007 11:38:33.614275 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" event={"ID":"5fce846e-41b6-4b09-a5db-236763b8e5f9","Type":"ContainerDied","Data":"ac3e04171ef90a66417460183beaf2d07c8259a7eb36db92868dc5bc4b3aab4d"} Oct 07 11:38:33 crc kubenswrapper[4700]: I1007 11:38:33.883388 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.033588 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.033808 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-log" containerID="cri-o://857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b" gracePeriod=30 Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.034181 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-httpd" containerID="cri-o://ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6" gracePeriod=30 Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.642666 4700 generic.go:334] "Generic (PLEG): container finished" podID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerID="857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b" exitCode=143 Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.642723 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerDied","Data":"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b"} Oct 07 11:38:34 crc kubenswrapper[4700]: I1007 11:38:34.873343 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.068934 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.069000 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.069040 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.069092 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rhm\" (UniqueName: \"kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.069134 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.069259 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc\") pod \"5fce846e-41b6-4b09-a5db-236763b8e5f9\" (UID: \"5fce846e-41b6-4b09-a5db-236763b8e5f9\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.114916 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm" (OuterVolumeSpecName: "kube-api-access-99rhm") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "kube-api-access-99rhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.171165 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rhm\" (UniqueName: \"kubernetes.io/projected/5fce846e-41b6-4b09-a5db-236763b8e5f9-kube-api-access-99rhm\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.172577 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.173674 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.178267 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.186634 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.195376 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.198676 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config" (OuterVolumeSpecName: "config") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.209935 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.213008 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5fce846e-41b6-4b09-a5db-236763b8e5f9" (UID: "5fce846e-41b6-4b09-a5db-236763b8e5f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.278144 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279660 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279708 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data\") pod \"a816d013-1668-445f-a7ab-0d25ad465c16\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279742 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279762 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279794 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom\") pod \"37d17d5c-b3c0-46bd-971e-194cda30f554\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279816 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279841 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl5w4\" (UniqueName: \"kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4\") pod \"37d17d5c-b3c0-46bd-971e-194cda30f554\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279863 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data\") pod \"37d17d5c-b3c0-46bd-971e-194cda30f554\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279888 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle\") pod \"a816d013-1668-445f-a7ab-0d25ad465c16\" (UID: 
\"a816d013-1668-445f-a7ab-0d25ad465c16\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279920 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmwb2\" (UniqueName: \"kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2\") pod \"a816d013-1668-445f-a7ab-0d25ad465c16\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279956 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.279980 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle\") pod \"37d17d5c-b3c0-46bd-971e-194cda30f554\" (UID: \"37d17d5c-b3c0-46bd-971e-194cda30f554\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280066 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom\") pod \"a816d013-1668-445f-a7ab-0d25ad465c16\" (UID: \"a816d013-1668-445f-a7ab-0d25ad465c16\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280104 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkmfw\" (UniqueName: \"kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw\") pod \"6a92f3ac-8266-4315-92f9-87bfffdc5660\" (UID: \"6a92f3ac-8266-4315-92f9-87bfffdc5660\") " Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280824 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280848 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280863 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280875 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.280888 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fce846e-41b6-4b09-a5db-236763b8e5f9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.281517 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.285355 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.294023 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "37d17d5c-b3c0-46bd-971e-194cda30f554" (UID: "37d17d5c-b3c0-46bd-971e-194cda30f554"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.295049 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts" (OuterVolumeSpecName: "scripts") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.298972 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4" (OuterVolumeSpecName: "kube-api-access-fl5w4") pod "37d17d5c-b3c0-46bd-971e-194cda30f554" (UID: "37d17d5c-b3c0-46bd-971e-194cda30f554"). InnerVolumeSpecName "kube-api-access-fl5w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.305146 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2" (OuterVolumeSpecName: "kube-api-access-rmwb2") pod "a816d013-1668-445f-a7ab-0d25ad465c16" (UID: "a816d013-1668-445f-a7ab-0d25ad465c16"). InnerVolumeSpecName "kube-api-access-rmwb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.312842 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw" (OuterVolumeSpecName: "kube-api-access-bkmfw") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "kube-api-access-bkmfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.329007 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.331429 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a816d013-1668-445f-a7ab-0d25ad465c16" (UID: "a816d013-1668-445f-a7ab-0d25ad465c16"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.352709 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37d17d5c-b3c0-46bd-971e-194cda30f554" (UID: "37d17d5c-b3c0-46bd-971e-194cda30f554"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.355666 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a816d013-1668-445f-a7ab-0d25ad465c16" (UID: "a816d013-1668-445f-a7ab-0d25ad465c16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.371734 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data" (OuterVolumeSpecName: "config-data") pod "a816d013-1668-445f-a7ab-0d25ad465c16" (UID: "a816d013-1668-445f-a7ab-0d25ad465c16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383053 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkmfw\" (UniqueName: \"kubernetes.io/projected/6a92f3ac-8266-4315-92f9-87bfffdc5660-kube-api-access-bkmfw\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383084 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383093 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383103 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 
11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383115 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383123 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a92f3ac-8266-4315-92f9-87bfffdc5660-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383134 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383145 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl5w4\" (UniqueName: \"kubernetes.io/projected/37d17d5c-b3c0-46bd-971e-194cda30f554-kube-api-access-fl5w4\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383155 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383165 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmwb2\" (UniqueName: \"kubernetes.io/projected/a816d013-1668-445f-a7ab-0d25ad465c16-kube-api-access-rmwb2\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383174 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.383183 4700 reconciler_common.go:293] "Volume 
detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816d013-1668-445f-a7ab-0d25ad465c16-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.386544 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data" (OuterVolumeSpecName: "config-data") pod "37d17d5c-b3c0-46bd-971e-194cda30f554" (UID: "37d17d5c-b3c0-46bd-971e-194cda30f554"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.419007 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.436355 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data" (OuterVolumeSpecName: "config-data") pod "6a92f3ac-8266-4315-92f9-87bfffdc5660" (UID: "6a92f3ac-8266-4315-92f9-87bfffdc5660"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.485554 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.485595 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d17d5c-b3c0-46bd-971e-194cda30f554-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.485610 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a92f3ac-8266-4315-92f9-87bfffdc5660-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.656293 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" event={"ID":"5fce846e-41b6-4b09-a5db-236763b8e5f9","Type":"ContainerDied","Data":"62415a0b8fe11be852914acc361137694130eb0cf4d453a8cfa3931efb545382"} Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.656336 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-795l6" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.656381 4700 scope.go:117] "RemoveContainer" containerID="ac3e04171ef90a66417460183beaf2d07c8259a7eb36db92868dc5bc4b3aab4d" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.662603 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a92f3ac-8266-4315-92f9-87bfffdc5660","Type":"ContainerDied","Data":"5722fb1c2acc9096791376874910328a1debff62375f03abbfba347da299ef57"} Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.662757 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.672541 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.672550 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79dfccd884-2qrjv" event={"ID":"a816d013-1668-445f-a7ab-0d25ad465c16","Type":"ContainerDied","Data":"eb6f8f74ae6c890a9c3d25c23bbea6b1300091256a40e2f694dd392fbdb28e1d"} Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.677572 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1cb5863c-c473-49b8-9ffa-ce83d51a061c","Type":"ContainerStarted","Data":"8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace"} Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.680854 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-89899c7cb-nsfvj" event={"ID":"37d17d5c-b3c0-46bd-971e-194cda30f554","Type":"ContainerDied","Data":"ce32ffef18cd6864708325d88ef63e3429c6d649aab2cc443c05d2c03112d8a0"} Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.680939 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-89899c7cb-nsfvj" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.698880 4700 scope.go:117] "RemoveContainer" containerID="056c7b430e0188ff314111a56c4956d1ceb343cde4e15133609ce7a369e8a830" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.703433 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.75705145 podStartE2EDuration="15.703409986s" podCreationTimestamp="2025-10-07 11:38:20 +0000 UTC" firstStartedPulling="2025-10-07 11:38:20.859301015 +0000 UTC m=+1067.655700004" lastFinishedPulling="2025-10-07 11:38:34.805659551 +0000 UTC m=+1081.602058540" observedRunningTime="2025-10-07 11:38:35.6970494 +0000 UTC m=+1082.493448389" watchObservedRunningTime="2025-10-07 11:38:35.703409986 +0000 UTC m=+1082.499808975" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.731101 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.750013 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-795l6"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.752096 4700 scope.go:117] "RemoveContainer" containerID="ebb70f04c36bc4ab7c8c63ce82ead44c3fb182ad3084e81bb4259c1c980990ee" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.798476 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.820007 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-79dfccd884-2qrjv"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.848925 4700 scope.go:117] "RemoveContainer" containerID="93173686d028ce24f9157f7fdf9d1711876ac324a6aadc898e0a3f2526f1e730" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.863439 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.871943 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.882886 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.898076 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.898607 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerName="heat-api" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.898794 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerName="heat-api" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.898812 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="init" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.898819 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="init" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.898836 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="sg-core" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.898842 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="sg-core" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.899112 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-notification-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.899121 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-notification-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.899133 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="dnsmasq-dns" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.899139 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="dnsmasq-dns" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.899154 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" containerName="heat-cfnapi" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.899321 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" containerName="heat-cfnapi" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.899348 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-central-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.899354 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-central-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: E1007 11:38:35.899369 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="proxy-httpd" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.899375 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="proxy-httpd" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.900787 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-central-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901045 4700 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" containerName="heat-api" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901063 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="ceilometer-notification-agent" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901072 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="sg-core" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901087 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" containerName="heat-cfnapi" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901330 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" containerName="dnsmasq-dns" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.901353 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" containerName="proxy-httpd" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.904679 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.907561 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.909873 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.924251 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.925187 4700 scope.go:117] "RemoveContainer" containerID="2739cdef6b7e2e259fb240f04cddb29fea6bbf362e7d5f1135d21f8fa61bb21b" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.954135 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-575565f88c-czn8g"] Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.976793 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fce846e-41b6-4b09-a5db-236763b8e5f9" path="/var/lib/kubelet/pods/5fce846e-41b6-4b09-a5db-236763b8e5f9/volumes" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.977758 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a92f3ac-8266-4315-92f9-87bfffdc5660" path="/var/lib/kubelet/pods/6a92f3ac-8266-4315-92f9-87bfffdc5660/volumes" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.980138 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a816d013-1668-445f-a7ab-0d25ad465c16" path="/var/lib/kubelet/pods/a816d013-1668-445f-a7ab-0d25ad465c16/volumes" Oct 07 11:38:35 crc kubenswrapper[4700]: I1007 11:38:35.982441 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.002508 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.002834 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqghv\" (UniqueName: \"kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.002995 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.003079 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.003192 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.003424 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd\") pod \"ceilometer-0\" (UID: 
\"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.003658 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.004842 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.018170 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-89899c7cb-nsfvj"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.024276 4700 scope.go:117] "RemoveContainer" containerID="e6d56bce692368beab37f3b7e247428068e180485ce31f6d8762a56c4b56248c" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.028400 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.040109 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-764bc4c4ff-fb769"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.050344 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bd5586b7f-mt9tb"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.063822 4700 scope.go:117] "RemoveContainer" containerID="2d2eed9745935a5a88f513aebe6b941c9fdbb77102ee892ff8c5d1898c49a8a0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.068245 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cd7d44d75-xs58b"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105066 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105106 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105137 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105204 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105294 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105404 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105429 
4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqghv\" (UniqueName: \"kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.105474 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.106563 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.111199 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.111759 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.113964 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " 
pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.115912 4700 scope.go:117] "RemoveContainer" containerID="84a782702159137ec346ea0622df1f9ecc1fbc33c47da26a05c7e0ab48c1c714" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.116010 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.136706 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqghv\" (UniqueName: \"kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv\") pod \"ceilometer-0\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.231757 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.717626 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" event={"ID":"5a792755-beef-4d08-a80d-8fd891e9027a","Type":"ContainerStarted","Data":"ccfcc6bcf7c3fc36790a05520082e9df7d0521bc0b564a10a36bbbfb58a4b86e"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.717877 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" event={"ID":"5a792755-beef-4d08-a80d-8fd891e9027a","Type":"ContainerStarted","Data":"2604d77fb497123b4500b541fe7aa6fff2099afc4746b80d361fef10255b9ccb"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.718969 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.728479 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dbfd865bb-8242b" event={"ID":"52265b5d-530b-471a-b1ac-492e37284937","Type":"ContainerStarted","Data":"e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.728517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dbfd865bb-8242b" event={"ID":"52265b5d-530b-471a-b1ac-492e37284937","Type":"ContainerStarted","Data":"924da609d2f72c820fe94bafaf15936dd5f8fdb198ee05e8e687efc0912c8a61"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.728633 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.747398 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" podStartSLOduration=5.747382309 podStartE2EDuration="5.747382309s" podCreationTimestamp="2025-10-07 11:38:31 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:36.744595736 +0000 UTC m=+1083.540994725" watchObservedRunningTime="2025-10-07 11:38:36.747382309 +0000 UTC m=+1083.543781298" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.762967 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bd5586b7f-mt9tb" event={"ID":"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15","Type":"ContainerStarted","Data":"78d9641fea6928bd0d55e5ca6793de08c7ff3a1ef7f7ff5eb3dc7f6e8eb35f0b"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.763010 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bd5586b7f-mt9tb" event={"ID":"6d6d6a4d-b338-4b5f-8606-ecb9129b2a15","Type":"ContainerStarted","Data":"41ff0e1850e27b12c98d4b181ddc6fda90e142dc258278ee3a201d2f03457107"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.763901 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.765824 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36ec56ec-a014-4027-a6c0-c817f5bda5ca","Type":"ContainerStarted","Data":"b3e9b9693bd062965ab1ec26983bc6a87873a88e1f961a689abba3921da9b812"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.777404 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-dbfd865bb-8242b" podStartSLOduration=7.77738135 podStartE2EDuration="7.77738135s" podCreationTimestamp="2025-10-07 11:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:36.772026981 +0000 UTC m=+1083.568425970" watchObservedRunningTime="2025-10-07 11:38:36.77738135 +0000 UTC m=+1083.573780339" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.788564 4700 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7d44d75-xs58b" event={"ID":"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81","Type":"ContainerStarted","Data":"a9e8e296900662ddcdc2e0c538eda4aa4ade9c2acdcde588ae524b6b429c6b13"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.788619 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7d44d75-xs58b" event={"ID":"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81","Type":"ContainerStarted","Data":"a31fe5670fa0a64f358e9129eb7b9d6d19b307fc84faa62361b8f097090024ab"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.791825 4700 generic.go:334] "Generic (PLEG): container finished" podID="457a22b8-1662-44fc-a94d-35001b8c574d" containerID="c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d" exitCode=1 Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.791883 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f979f4f58-7tjtf" event={"ID":"457a22b8-1662-44fc-a94d-35001b8c574d","Type":"ContainerDied","Data":"c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.791918 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f979f4f58-7tjtf" event={"ID":"457a22b8-1662-44fc-a94d-35001b8c574d","Type":"ContainerStarted","Data":"9a27eda8f0042e57d6fe8287e865ee1c69126d7b81c00cf539e80119a760bb92"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.792595 4700 scope.go:117] "RemoveContainer" containerID="c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.812898 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5bd5586b7f-mt9tb" podStartSLOduration=5.812875265 podStartE2EDuration="5.812875265s" podCreationTimestamp="2025-10-07 11:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-07 11:38:36.799743753 +0000 UTC m=+1083.596142762" watchObservedRunningTime="2025-10-07 11:38:36.812875265 +0000 UTC m=+1083.609274254" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.818450 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-575565f88c-czn8g" event={"ID":"83eadcce-bdaa-493b-b76c-91cdfd9f8b15","Type":"ContainerStarted","Data":"752aa2ad481de65d78fcdad8ef43984e550e54dce8d3a8890b8650ec6c58e16e"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.818496 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.818506 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-575565f88c-czn8g" event={"ID":"83eadcce-bdaa-493b-b76c-91cdfd9f8b15","Type":"ContainerStarted","Data":"2f0eb5a3d00350bd773d07bcbbe0eece19b5fb4b5aa7eb65aae67e4ead6c753d"} Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.855378 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:36 crc kubenswrapper[4700]: I1007 11:38:36.861213 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-575565f88c-czn8g" podStartSLOduration=7.861195433 podStartE2EDuration="7.861195433s" podCreationTimestamp="2025-10-07 11:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:36.849893249 +0000 UTC m=+1083.646292238" watchObservedRunningTime="2025-10-07 11:38:36.861195433 +0000 UTC m=+1083.657594422" Oct 07 11:38:37 crc kubenswrapper[4700]: W1007 11:38:37.248898 4700 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457a22b8_1662_44fc_a94d_35001b8c574d.slice/crio-conmon-c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457a22b8_1662_44fc_a94d_35001b8c574d.slice/crio-conmon-c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d.scope: no such file or directory Oct 07 11:38:37 crc kubenswrapper[4700]: W1007 11:38:37.249201 4700 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457a22b8_1662_44fc_a94d_35001b8c574d.slice/crio-c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457a22b8_1662_44fc_a94d_35001b8c574d.slice/crio-c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d.scope: no such file or directory Oct 07 11:38:37 crc kubenswrapper[4700]: W1007 11:38:37.249234 4700 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52265b5d_530b_471a_b1ac_492e37284937.slice/crio-conmon-e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52265b5d_530b_471a_b1ac_492e37284937.slice/crio-conmon-e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1.scope: no such file or directory Oct 07 11:38:37 crc kubenswrapper[4700]: W1007 11:38:37.249264 4700 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52265b5d_530b_471a_b1ac_492e37284937.slice/crio-e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52265b5d_530b_471a_b1ac_492e37284937.slice/crio-e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1.scope: no such file or directory Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.778532 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.846935 4700 generic.go:334] "Generic (PLEG): container finished" podID="52265b5d-530b-471a-b1ac-492e37284937" containerID="e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1" exitCode=1 Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.846998 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dbfd865bb-8242b" event={"ID":"52265b5d-530b-471a-b1ac-492e37284937","Type":"ContainerDied","Data":"e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.847652 4700 scope.go:117] "RemoveContainer" containerID="e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.856143 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7d44d75-xs58b" event={"ID":"2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81","Type":"ContainerStarted","Data":"b1577fee94d1339fe1dc79c606d0da92ef6961929ad8034e38e767364a381deb"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.856279 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.856331 4700 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.875664 4700 generic.go:334] "Generic (PLEG): container finished" podID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerID="ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6" exitCode=0 Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.875806 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.875860 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerDied","Data":"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.875915 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f62089a-2daa-491e-bcdb-7f793df7cd99","Type":"ContainerDied","Data":"a1623f978fcf4d67b261a6e3c9463e9d73f57e01b13bcb9dfb6af5bd60b69f55"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.875936 4700 scope.go:117] "RemoveContainer" containerID="ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.893648 4700 generic.go:334] "Generic (PLEG): container finished" podID="457a22b8-1662-44fc-a94d-35001b8c574d" containerID="a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52" exitCode=1 Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.893740 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f979f4f58-7tjtf" event={"ID":"457a22b8-1662-44fc-a94d-35001b8c574d","Type":"ContainerDied","Data":"a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.894596 4700 scope.go:117] 
"RemoveContainer" containerID="a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52" Oct 07 11:38:37 crc kubenswrapper[4700]: E1007 11:38:37.894818 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f979f4f58-7tjtf_openstack(457a22b8-1662-44fc-a94d-35001b8c574d)\"" pod="openstack/heat-api-7f979f4f58-7tjtf" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.896136 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7cd7d44d75-xs58b" podStartSLOduration=10.896121511 podStartE2EDuration="10.896121511s" podCreationTimestamp="2025-10-07 11:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:37.893490682 +0000 UTC m=+1084.689889671" watchObservedRunningTime="2025-10-07 11:38:37.896121511 +0000 UTC m=+1084.692520520" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.897021 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerStarted","Data":"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.897049 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerStarted","Data":"50b326d9cedf2b7477d7479ef2989153c6aaf5b027d54b8895050e21dc1dd0f1"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.907472 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36ec56ec-a014-4027-a6c0-c817f5bda5ca","Type":"ContainerStarted","Data":"911cb70eead502d64e2ed3d2f2afc75f7bf8232e79850e168013e4515cc31cf7"} Oct 07 
11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.907547 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36ec56ec-a014-4027-a6c0-c817f5bda5ca","Type":"ContainerStarted","Data":"8b9287f344c28a0206a56e6b898f1b2962afe51dbae84dd8a80fc525de41bccb"} Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.931011 4700 scope.go:117] "RemoveContainer" containerID="857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.939901 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.939949 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940000 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbncm\" (UniqueName: \"kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940046 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940069 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940179 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940232 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.940284 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle\") pod \"4f62089a-2daa-491e-bcdb-7f793df7cd99\" (UID: \"4f62089a-2daa-491e-bcdb-7f793df7cd99\") " Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.941452 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.944140 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs" (OuterVolumeSpecName: "logs") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.958891 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.959608 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm" (OuterVolumeSpecName: "kube-api-access-vbncm") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "kube-api-access-vbncm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.977254 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts" (OuterVolumeSpecName: "scripts") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:37 crc kubenswrapper[4700]: I1007 11:38:37.986480 4700 scope.go:117] "RemoveContainer" containerID="ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6" Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.007602 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6\": container with ID starting with ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6 not found: ID does not exist" containerID="ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.007667 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6"} err="failed to get container status \"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6\": rpc error: code = NotFound desc = could not find container \"ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6\": container with ID starting with ab00a65ae7be37741c52aa0fdf0fd85e68897595cfa5bb120b2d48d05e2b3de6 not found: ID does not exist" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.007691 4700 scope.go:117] "RemoveContainer" containerID="857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.017228 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d17d5c-b3c0-46bd-971e-194cda30f554" path="/var/lib/kubelet/pods/37d17d5c-b3c0-46bd-971e-194cda30f554/volumes" Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.018688 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b\": container with ID 
starting with 857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b not found: ID does not exist" containerID="857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.018727 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b"} err="failed to get container status \"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b\": rpc error: code = NotFound desc = could not find container \"857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b\": container with ID starting with 857a04c2d9ccedf07de6eec685d503be77a0237c655ba26ff0e0a2cfe018d57b not found: ID does not exist" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.018767 4700 scope.go:117] "RemoveContainer" containerID="c4d5e8afbabe402548e62f3a80bc4e5b0cd644f0a01db9d923ebf4e2dea4277d" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.021707 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=11.021687381 podStartE2EDuration="11.021687381s" podCreationTimestamp="2025-10-07 11:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:37.931593484 +0000 UTC m=+1084.727992473" watchObservedRunningTime="2025-10-07 11:38:38.021687381 +0000 UTC m=+1084.818086370" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.046967 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.046997 4700 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-httpd-run\") 
on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.047007 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbncm\" (UniqueName: \"kubernetes.io/projected/4f62089a-2daa-491e-bcdb-7f793df7cd99-kube-api-access-vbncm\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.047018 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f62089a-2daa-491e-bcdb-7f793df7cd99-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.047027 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.074869 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data" (OuterVolumeSpecName: "config-data") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.101868 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.120055 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f62089a-2daa-491e-bcdb-7f793df7cd99" (UID: "4f62089a-2daa-491e-bcdb-7f793df7cd99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.128929 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.148895 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.148929 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.148940 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.148949 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f62089a-2daa-491e-bcdb-7f793df7cd99-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.215365 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 
11:38:38.226580 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.237170 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.237580 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-httpd" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.237600 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-httpd" Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.237626 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-log" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.237634 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-log" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.237835 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-log" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.237855 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" containerName="glance-httpd" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.238759 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.242904 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.244281 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.254771 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.350414 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.355376 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.355636 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.355757 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: 
I1007 11:38:38.355840 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gh5z\" (UniqueName: \"kubernetes.io/projected/2c374f64-ff8f-42c4-b879-fc4a8462a252-kube-api-access-2gh5z\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.355936 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.356018 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-logs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.356088 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.356153 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 
11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461254 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461320 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gh5z\" (UniqueName: \"kubernetes.io/projected/2c374f64-ff8f-42c4-b879-fc4a8462a252-kube-api-access-2gh5z\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461369 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461397 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-logs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461416 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461442 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461486 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.461516 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.462372 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-logs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.462464 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.463031 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2c374f64-ff8f-42c4-b879-fc4a8462a252-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.475285 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.481832 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.487277 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.494991 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gh5z\" (UniqueName: \"kubernetes.io/projected/2c374f64-ff8f-42c4-b879-fc4a8462a252-kube-api-access-2gh5z\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.510408 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c374f64-ff8f-42c4-b879-fc4a8462a252-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.544614 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c374f64-ff8f-42c4-b879-fc4a8462a252\") " pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.562759 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.947430 4700 generic.go:334] "Generic (PLEG): container finished" podID="52265b5d-530b-471a-b1ac-492e37284937" containerID="8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63" exitCode=1 Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.947538 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dbfd865bb-8242b" event={"ID":"52265b5d-530b-471a-b1ac-492e37284937","Type":"ContainerDied","Data":"8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63"} Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.948459 4700 scope.go:117] "RemoveContainer" containerID="8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63" Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.949653 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-dbfd865bb-8242b_openstack(52265b5d-530b-471a-b1ac-492e37284937)\"" pod="openstack/heat-cfnapi-dbfd865bb-8242b" podUID="52265b5d-530b-471a-b1ac-492e37284937" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.949905 4700 scope.go:117] "RemoveContainer" 
containerID="e17d019ff2a8adfba59c98cecf4eb9d6b262859645b116fa430e9b8fdeed6ea1" Oct 07 11:38:38 crc kubenswrapper[4700]: I1007 11:38:38.982332 4700 scope.go:117] "RemoveContainer" containerID="a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52" Oct 07 11:38:38 crc kubenswrapper[4700]: E1007 11:38:38.982507 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f979f4f58-7tjtf_openstack(457a22b8-1662-44fc-a94d-35001b8c574d)\"" pod="openstack/heat-api-7f979f4f58-7tjtf" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.236671 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.968341 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f62089a-2daa-491e-bcdb-7f793df7cd99" path="/var/lib/kubelet/pods/4f62089a-2daa-491e-bcdb-7f793df7cd99/volumes" Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.969368 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.969597 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.991917 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerStarted","Data":"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a"} Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.994732 4700 scope.go:117] "RemoveContainer" containerID="8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63" Oct 07 11:38:39 crc kubenswrapper[4700]: E1007 11:38:39.995035 4700 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-dbfd865bb-8242b_openstack(52265b5d-530b-471a-b1ac-492e37284937)\"" pod="openstack/heat-cfnapi-dbfd865bb-8242b" podUID="52265b5d-530b-471a-b1ac-492e37284937" Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.997202 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c374f64-ff8f-42c4-b879-fc4a8462a252","Type":"ContainerStarted","Data":"16d90f14d6e5b18c48f68c9ed2390e4ea766edef50985107cfb8e521d192a58b"} Oct 07 11:38:39 crc kubenswrapper[4700]: I1007 11:38:39.998144 4700 scope.go:117] "RemoveContainer" containerID="a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52" Oct 07 11:38:39 crc kubenswrapper[4700]: E1007 11:38:39.998363 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f979f4f58-7tjtf_openstack(457a22b8-1662-44fc-a94d-35001b8c574d)\"" pod="openstack/heat-api-7f979f4f58-7tjtf" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" Oct 07 11:38:40 crc kubenswrapper[4700]: I1007 11:38:40.225979 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:40 crc kubenswrapper[4700]: I1007 11:38:40.226021 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 11:38:41.016732 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c374f64-ff8f-42c4-b879-fc4a8462a252","Type":"ContainerStarted","Data":"ff7698d7784921eef0717f91d613cf38271acdd2a798449c3bacd3efc93d0fc7"} Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 
11:38:41.023264 4700 scope.go:117] "RemoveContainer" containerID="8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63" Oct 07 11:38:41 crc kubenswrapper[4700]: E1007 11:38:41.024070 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-dbfd865bb-8242b_openstack(52265b5d-530b-471a-b1ac-492e37284937)\"" pod="openstack/heat-cfnapi-dbfd865bb-8242b" podUID="52265b5d-530b-471a-b1ac-492e37284937" Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 11:38:41.346940 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 11:38:41.347533 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-log" containerID="cri-o://0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040" gracePeriod=30 Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 11:38:41.348030 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-httpd" containerID="cri-o://54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646" gracePeriod=30 Oct 07 11:38:41 crc kubenswrapper[4700]: I1007 11:38:41.951052 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.026795 4700 generic.go:334] "Generic (PLEG): container finished" podID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerID="0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040" exitCode=143 Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.026885 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerDied","Data":"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040"} Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.029992 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c374f64-ff8f-42c4-b879-fc4a8462a252","Type":"ContainerStarted","Data":"9d5584feec7b0efa2883483ec9cb8a70c7c4a4f3313b98cafca86c10bdaf5e5b"} Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.032577 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerStarted","Data":"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246"} Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.053782 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.053763886 podStartE2EDuration="4.053763886s" podCreationTimestamp="2025-10-07 11:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:42.049900796 +0000 UTC m=+1088.846299805" watchObservedRunningTime="2025-10-07 11:38:42.053763886 +0000 UTC m=+1088.850162875" Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.402722 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:38:42 crc kubenswrapper[4700]: I1007 11:38:42.701368 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.040204 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5bd5586b7f-mt9tb" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044521 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerStarted","Data":"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e"} Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044673 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044706 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="proxy-httpd" containerID="cri-o://d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e" gracePeriod=30 Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044721 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-notification-agent" containerID="cri-o://9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a" gracePeriod=30 Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044683 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-central-agent" containerID="cri-o://4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc" gracePeriod=30 Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.044707 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="sg-core" containerID="cri-o://95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246" gracePeriod=30 Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.079287 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-764bc4c4ff-fb769" Oct 07 11:38:43 crc 
kubenswrapper[4700]: I1007 11:38:43.094528 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.097956 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.409551277 podStartE2EDuration="8.097942185s" podCreationTimestamp="2025-10-07 11:38:35 +0000 UTC" firstStartedPulling="2025-10-07 11:38:36.907644313 +0000 UTC m=+1083.704043302" lastFinishedPulling="2025-10-07 11:38:42.596035221 +0000 UTC m=+1089.392434210" observedRunningTime="2025-10-07 11:38:43.083689583 +0000 UTC m=+1089.880088572" watchObservedRunningTime="2025-10-07 11:38:43.097942185 +0000 UTC m=+1089.894341174" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.208163 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.344720 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.345571 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cd7d44d75-xs58b" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.561928 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.689182 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.701454 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data\") pod \"457a22b8-1662-44fc-a94d-35001b8c574d\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.701530 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle\") pod \"457a22b8-1662-44fc-a94d-35001b8c574d\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.701601 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom\") pod \"457a22b8-1662-44fc-a94d-35001b8c574d\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.701659 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsd2\" (UniqueName: \"kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2\") pod \"457a22b8-1662-44fc-a94d-35001b8c574d\" (UID: \"457a22b8-1662-44fc-a94d-35001b8c574d\") " Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.708893 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2" (OuterVolumeSpecName: "kube-api-access-6gsd2") pod "457a22b8-1662-44fc-a94d-35001b8c574d" (UID: 
"457a22b8-1662-44fc-a94d-35001b8c574d"). InnerVolumeSpecName "kube-api-access-6gsd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.714505 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "457a22b8-1662-44fc-a94d-35001b8c574d" (UID: "457a22b8-1662-44fc-a94d-35001b8c574d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.772160 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "457a22b8-1662-44fc-a94d-35001b8c574d" (UID: "457a22b8-1662-44fc-a94d-35001b8c574d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.780477 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data" (OuterVolumeSpecName: "config-data") pod "457a22b8-1662-44fc-a94d-35001b8c574d" (UID: "457a22b8-1662-44fc-a94d-35001b8c574d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.805164 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.805187 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.805198 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/457a22b8-1662-44fc-a94d-35001b8c574d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.805206 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsd2\" (UniqueName: \"kubernetes.io/projected/457a22b8-1662-44fc-a94d-35001b8c574d-kube-api-access-6gsd2\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:43 crc kubenswrapper[4700]: I1007 11:38:43.837200 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.007463 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom\") pod \"52265b5d-530b-471a-b1ac-492e37284937\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.007507 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data\") pod \"52265b5d-530b-471a-b1ac-492e37284937\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.007582 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle\") pod \"52265b5d-530b-471a-b1ac-492e37284937\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.007658 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2td8r\" (UniqueName: \"kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r\") pod \"52265b5d-530b-471a-b1ac-492e37284937\" (UID: \"52265b5d-530b-471a-b1ac-492e37284937\") " Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.011033 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52265b5d-530b-471a-b1ac-492e37284937" (UID: "52265b5d-530b-471a-b1ac-492e37284937"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.015587 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r" (OuterVolumeSpecName: "kube-api-access-2td8r") pod "52265b5d-530b-471a-b1ac-492e37284937" (UID: "52265b5d-530b-471a-b1ac-492e37284937"). InnerVolumeSpecName "kube-api-access-2td8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.043415 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52265b5d-530b-471a-b1ac-492e37284937" (UID: "52265b5d-530b-471a-b1ac-492e37284937"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.057934 4700 generic.go:334] "Generic (PLEG): container finished" podID="818a295a-ff53-4276-9abd-3800769ccbf2" containerID="d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e" exitCode=0 Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.057965 4700 generic.go:334] "Generic (PLEG): container finished" podID="818a295a-ff53-4276-9abd-3800769ccbf2" containerID="95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246" exitCode=2 Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.057973 4700 generic.go:334] "Generic (PLEG): container finished" podID="818a295a-ff53-4276-9abd-3800769ccbf2" containerID="9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a" exitCode=0 Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.058005 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerDied","Data":"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e"} Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.058030 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerDied","Data":"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246"} Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.058039 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerDied","Data":"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a"} Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.059500 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dbfd865bb-8242b" event={"ID":"52265b5d-530b-471a-b1ac-492e37284937","Type":"ContainerDied","Data":"924da609d2f72c820fe94bafaf15936dd5f8fdb198ee05e8e687efc0912c8a61"} Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.059532 4700 scope.go:117] "RemoveContainer" containerID="8cb6fabf578b2bdc7504a7296e7e046b4c155bd505f882845174cc5ebc4c0e63" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.059609 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dbfd865bb-8242b" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.065483 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f979f4f58-7tjtf" event={"ID":"457a22b8-1662-44fc-a94d-35001b8c574d","Type":"ContainerDied","Data":"9a27eda8f0042e57d6fe8287e865ee1c69126d7b81c00cf539e80119a760bb92"} Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.065720 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7f979f4f58-7tjtf" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.085328 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data" (OuterVolumeSpecName: "config-data") pod "52265b5d-530b-471a-b1ac-492e37284937" (UID: "52265b5d-530b-471a-b1ac-492e37284937"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.098451 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.107507 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7f979f4f58-7tjtf"] Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.107782 4700 scope.go:117] "RemoveContainer" containerID="a4b0a44c0fbf9aebf449db8ecb60a2d7269b0e32d8524bc59ce477368bd91f52" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.109347 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2td8r\" (UniqueName: \"kubernetes.io/projected/52265b5d-530b-471a-b1ac-492e37284937-kube-api-access-2td8r\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.109367 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.109377 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.109385 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52265b5d-530b-471a-b1ac-492e37284937-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.388276 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.395456 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-dbfd865bb-8242b"] Oct 07 11:38:44 crc kubenswrapper[4700]: I1007 11:38:44.945607 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.024641 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.024979 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025070 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025100 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpz78\" (UniqueName: \"kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78\") pod 
\"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025138 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025164 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025233 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025264 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data\") pod \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\" (UID: \"cba6fb68-aeed-4282-ab35-a7cc04b39f52\") " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.025965 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs" (OuterVolumeSpecName: "logs") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.026055 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.026761 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.026777 4700 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba6fb68-aeed-4282-ab35-a7cc04b39f52-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.032439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.041395 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78" (OuterVolumeSpecName: "kube-api-access-lpz78") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "kube-api-access-lpz78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.067915 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts" (OuterVolumeSpecName: "scripts") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.076369 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.077911 4700 generic.go:334] "Generic (PLEG): container finished" podID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerID="54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646" exitCode=0 Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.078037 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.078059 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerDied","Data":"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646"} Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.078497 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba6fb68-aeed-4282-ab35-a7cc04b39f52","Type":"ContainerDied","Data":"b8701cccb134785eebf4bd59ef51de73fb7d5891b3a7d4738de6869724c3e920"} Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.078517 4700 scope.go:117] "RemoveContainer" containerID="54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.102263 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.109911 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data" (OuterVolumeSpecName: "config-data") pod "cba6fb68-aeed-4282-ab35-a7cc04b39f52" (UID: "cba6fb68-aeed-4282-ab35-a7cc04b39f52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129890 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129924 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129934 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpz78\" (UniqueName: \"kubernetes.io/projected/cba6fb68-aeed-4282-ab35-a7cc04b39f52-kube-api-access-lpz78\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129954 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129975 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.129986 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba6fb68-aeed-4282-ab35-a7cc04b39f52-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.148740 4700 scope.go:117] "RemoveContainer" containerID="0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.152674 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.166850 4700 scope.go:117] "RemoveContainer" containerID="54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.167384 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646\": container with ID starting with 54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646 not found: ID does not exist" containerID="54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.167427 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646"} err="failed to get container status \"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646\": rpc error: code = NotFound desc = could not find container \"54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646\": container with ID starting with 54c2c37fcb8e59df9fa07e4fc2f35213d67d2d5206f780dd01cd2bed58173646 not found: ID does not exist" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.167461 4700 scope.go:117] "RemoveContainer" containerID="0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.167880 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040\": container with ID starting with 0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040 not found: ID does not exist" containerID="0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 
11:38:45.167920 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040"} err="failed to get container status \"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040\": rpc error: code = NotFound desc = could not find container \"0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040\": container with ID starting with 0112a381ded2be88f59f7f1f1a3984214d56e25e9f218ab1c136ca1b27748040 not found: ID does not exist" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.232047 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.333914 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.333974 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.419692 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.434242 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.473611 4700 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474050 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474068 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474078 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-httpd" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474085 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-httpd" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474108 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474114 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474133 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-log" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474139 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-log" Oct 07 11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474148 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474153 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 
11:38:45 crc kubenswrapper[4700]: E1007 11:38:45.474169 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474176 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474356 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474369 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-httpd" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474377 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474388 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" containerName="glance-log" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474785 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" containerName="heat-api" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.474818 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="52265b5d-530b-471a-b1ac-492e37284937" containerName="heat-cfnapi" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.475479 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.481687 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.487907 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.488096 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641137 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641160 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641196 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641220 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641248 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr7v\" (UniqueName: \"kubernetes.io/projected/91a6e182-e619-4e81-a9f1-4a31630788c5-kube-api-access-tsr7v\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641272 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.641298 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.742594 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.742992 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743033 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr7v\" (UniqueName: \"kubernetes.io/projected/91a6e182-e619-4e81-a9f1-4a31630788c5-kube-api-access-tsr7v\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743074 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743131 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743224 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743283 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743388 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743636 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743861 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.743907 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a6e182-e619-4e81-a9f1-4a31630788c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.749354 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.749952 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.752163 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.760096 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a6e182-e619-4e81-a9f1-4a31630788c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.764165 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr7v\" (UniqueName: \"kubernetes.io/projected/91a6e182-e619-4e81-a9f1-4a31630788c5-kube-api-access-tsr7v\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.783368 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"91a6e182-e619-4e81-a9f1-4a31630788c5\") " pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.809327 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.885558 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.971134 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457a22b8-1662-44fc-a94d-35001b8c574d" path="/var/lib/kubelet/pods/457a22b8-1662-44fc-a94d-35001b8c574d/volumes" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.972268 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52265b5d-530b-471a-b1ac-492e37284937" path="/var/lib/kubelet/pods/52265b5d-530b-471a-b1ac-492e37284937/volumes" Oct 07 11:38:45 crc kubenswrapper[4700]: I1007 11:38:45.972847 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba6fb68-aeed-4282-ab35-a7cc04b39f52" path="/var/lib/kubelet/pods/cba6fb68-aeed-4282-ab35-a7cc04b39f52/volumes" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054087 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqghv\" (UniqueName: \"kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054159 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054212 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054279 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054396 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054435 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054544 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.054581 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd\") pod \"818a295a-ff53-4276-9abd-3800769ccbf2\" (UID: \"818a295a-ff53-4276-9abd-3800769ccbf2\") " Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.055163 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.055662 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.061325 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts" (OuterVolumeSpecName: "scripts") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.066316 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv" (OuterVolumeSpecName: "kube-api-access-hqghv") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "kube-api-access-hqghv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.093443 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.118854 4700 generic.go:334] "Generic (PLEG): container finished" podID="818a295a-ff53-4276-9abd-3800769ccbf2" containerID="4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc" exitCode=0 Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.118912 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerDied","Data":"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc"} Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.118940 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"818a295a-ff53-4276-9abd-3800769ccbf2","Type":"ContainerDied","Data":"50b326d9cedf2b7477d7479ef2989153c6aaf5b027d54b8895050e21dc1dd0f1"} Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.118957 4700 scope.go:117] "RemoveContainer" containerID="d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e" Oct 07 
11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.119074 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.142344 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.153553 4700 scope.go:117] "RemoveContainer" containerID="95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.156447 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818a295a-ff53-4276-9abd-3800769ccbf2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.156473 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqghv\" (UniqueName: \"kubernetes.io/projected/818a295a-ff53-4276-9abd-3800769ccbf2-kube-api-access-hqghv\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.156484 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.156497 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.156508 4700 reconciler_common.go:293] "Volume detached for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.177992 4700 scope.go:117] "RemoveContainer" containerID="9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.205035 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data" (OuterVolumeSpecName: "config-data") pod "818a295a-ff53-4276-9abd-3800769ccbf2" (UID: "818a295a-ff53-4276-9abd-3800769ccbf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.205945 4700 scope.go:117] "RemoveContainer" containerID="4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.227597 4700 scope.go:117] "RemoveContainer" containerID="d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.228062 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e\": container with ID starting with d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e not found: ID does not exist" containerID="d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.228106 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e"} err="failed to get container status \"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e\": rpc error: code = NotFound desc = could not find container 
\"d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e\": container with ID starting with d8ad8e75f6e73ccbcb84da8063db2a39053cdb02500dd7b41696686adb6bf94e not found: ID does not exist" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.228142 4700 scope.go:117] "RemoveContainer" containerID="95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.228646 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246\": container with ID starting with 95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246 not found: ID does not exist" containerID="95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.228683 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246"} err="failed to get container status \"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246\": rpc error: code = NotFound desc = could not find container \"95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246\": container with ID starting with 95ccfae1932686b26ab12a6cd401f2c671d964796eee5f9a7f751f7b8c0b0246 not found: ID does not exist" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.228736 4700 scope.go:117] "RemoveContainer" containerID="9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.229134 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a\": container with ID starting with 9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a not found: ID does not exist" 
containerID="9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.229173 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a"} err="failed to get container status \"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a\": rpc error: code = NotFound desc = could not find container \"9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a\": container with ID starting with 9e17660cd50ef266dc24d9f1106b346c9dacb3943e8d105ba116b0bf36cacf5a not found: ID does not exist" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.229193 4700 scope.go:117] "RemoveContainer" containerID="4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.229564 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc\": container with ID starting with 4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc not found: ID does not exist" containerID="4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.229596 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc"} err="failed to get container status \"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc\": rpc error: code = NotFound desc = could not find container \"4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc\": container with ID starting with 4ef90bdf64ec71d230353654d84c3acd17d59654c6f5c09969faa6a9c1f502dc not found: ID does not exist" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.258761 4700 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818a295a-ff53-4276-9abd-3800769ccbf2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.382521 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 11:38:46 crc kubenswrapper[4700]: W1007 11:38:46.389929 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a6e182_e619_4e81_a9f1_4a31630788c5.slice/crio-b260f4d94c01b5020d1633dcf6dedee9a1749017a858010c4083c362cf2e39dd WatchSource:0}: Error finding container b260f4d94c01b5020d1633dcf6dedee9a1749017a858010c4083c362cf2e39dd: Status 404 returned error can't find the container with id b260f4d94c01b5020d1633dcf6dedee9a1749017a858010c4083c362cf2e39dd Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.462848 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.471479 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.485361 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.485929 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-notification-agent" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.485953 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-notification-agent" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.485972 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-central-agent" Oct 07 11:38:46 crc 
kubenswrapper[4700]: I1007 11:38:46.485980 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-central-agent" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.485991 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="proxy-httpd" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.485999 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="proxy-httpd" Oct 07 11:38:46 crc kubenswrapper[4700]: E1007 11:38:46.486036 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="sg-core" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.486044 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="sg-core" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.486272 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="proxy-httpd" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.486293 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-notification-agent" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.486320 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="sg-core" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.486349 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" containerName="ceilometer-central-agent" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.498286 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.502598 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.502827 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.526525 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664172 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664494 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664516 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kglrw\" (UniqueName: \"kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664676 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664735 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.664826 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.665036 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766524 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766581 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766681 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766769 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766792 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766813 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kglrw\" (UniqueName: \"kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.766907 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.768116 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 
crc kubenswrapper[4700]: I1007 11:38:46.768132 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.774437 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.775056 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.781924 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.791006 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts\") pod \"ceilometer-0\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.791198 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kglrw\" (UniqueName: \"kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw\") pod \"ceilometer-0\" (UID: 
\"87b85928-284c-4a99-81b2-26be9c9d8acf\") " pod="openstack/ceilometer-0" Oct 07 11:38:46 crc kubenswrapper[4700]: I1007 11:38:46.829106 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.149958 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91a6e182-e619-4e81-a9f1-4a31630788c5","Type":"ContainerStarted","Data":"9b3bc6577e21842d0234043224b8740eace87fbe828c846020d516c762f9176c"} Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.152141 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91a6e182-e619-4e81-a9f1-4a31630788c5","Type":"ContainerStarted","Data":"b260f4d94c01b5020d1633dcf6dedee9a1749017a858010c4083c362cf2e39dd"} Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.295865 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:47 crc kubenswrapper[4700]: W1007 11:38:47.303529 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87b85928_284c_4a99_81b2_26be9c9d8acf.slice/crio-54dac636d8b75719e831e6ccbbeb7beea303e7c79a5cf67aa59a7d1f9f80433e WatchSource:0}: Error finding container 54dac636d8b75719e831e6ccbbeb7beea303e7c79a5cf67aa59a7d1f9f80433e: Status 404 returned error can't find the container with id 54dac636d8b75719e831e6ccbbeb7beea303e7c79a5cf67aa59a7d1f9f80433e Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.491535 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.835965 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-744b8f5559-c67wh" Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.905188 4700 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.905428 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7554f5f8dd-vxdsk" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-api" containerID="cri-o://9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748" gracePeriod=30 Oct 07 11:38:47 crc kubenswrapper[4700]: I1007 11:38:47.905820 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7554f5f8dd-vxdsk" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-httpd" containerID="cri-o://8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f" gracePeriod=30 Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.029738 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818a295a-ff53-4276-9abd-3800769ccbf2" path="/var/lib/kubelet/pods/818a295a-ff53-4276-9abd-3800769ccbf2/volumes" Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.164096 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerStarted","Data":"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890"} Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.164140 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerStarted","Data":"54dac636d8b75719e831e6ccbbeb7beea303e7c79a5cf67aa59a7d1f9f80433e"} Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.165333 4700 generic.go:334] "Generic (PLEG): container finished" podID="08ef410f-221b-4062-a3a0-e20f2169c488" containerID="8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f" exitCode=0 Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.165371 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerDied","Data":"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f"} Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.173277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91a6e182-e619-4e81-a9f1-4a31630788c5","Type":"ContainerStarted","Data":"c9418f139fa8ffae1603372286692260391f3c71f5b647d318a10bde23787325"} Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.202083 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.202066415 podStartE2EDuration="3.202066415s" podCreationTimestamp="2025-10-07 11:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:38:48.193182243 +0000 UTC m=+1094.989581232" watchObservedRunningTime="2025-10-07 11:38:48.202066415 +0000 UTC m=+1094.998465394" Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.564163 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.564222 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.604973 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 11:38:48 crc kubenswrapper[4700]: I1007 11:38:48.635130 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 11:38:49 crc kubenswrapper[4700]: I1007 11:38:49.187199 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerStarted","Data":"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9"} Oct 07 11:38:49 crc kubenswrapper[4700]: I1007 11:38:49.187805 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 11:38:49 crc kubenswrapper[4700]: I1007 11:38:49.188469 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.011352 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.136002 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config\") pod \"08ef410f-221b-4062-a3a0-e20f2169c488\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.136114 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8qbk\" (UniqueName: \"kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk\") pod \"08ef410f-221b-4062-a3a0-e20f2169c488\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.136181 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle\") pod \"08ef410f-221b-4062-a3a0-e20f2169c488\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.136338 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config\") pod \"08ef410f-221b-4062-a3a0-e20f2169c488\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.136365 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs\") pod \"08ef410f-221b-4062-a3a0-e20f2169c488\" (UID: \"08ef410f-221b-4062-a3a0-e20f2169c488\") " Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.141998 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk" (OuterVolumeSpecName: "kube-api-access-k8qbk") pod "08ef410f-221b-4062-a3a0-e20f2169c488" (UID: "08ef410f-221b-4062-a3a0-e20f2169c488"). InnerVolumeSpecName "kube-api-access-k8qbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.155470 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "08ef410f-221b-4062-a3a0-e20f2169c488" (UID: "08ef410f-221b-4062-a3a0-e20f2169c488"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.191590 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08ef410f-221b-4062-a3a0-e20f2169c488" (UID: "08ef410f-221b-4062-a3a0-e20f2169c488"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.198118 4700 generic.go:334] "Generic (PLEG): container finished" podID="08ef410f-221b-4062-a3a0-e20f2169c488" containerID="9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748" exitCode=0 Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.198185 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerDied","Data":"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748"} Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.198217 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7554f5f8dd-vxdsk" event={"ID":"08ef410f-221b-4062-a3a0-e20f2169c488","Type":"ContainerDied","Data":"77ca4d0b0ddb89f5894ebf89de49125c5b6a6ac9c0c3a1b0f5adc1acc650055a"} Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.198237 4700 scope.go:117] "RemoveContainer" containerID="8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.198394 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7554f5f8dd-vxdsk" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.205581 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerStarted","Data":"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab"} Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.219446 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config" (OuterVolumeSpecName: "config") pod "08ef410f-221b-4062-a3a0-e20f2169c488" (UID: "08ef410f-221b-4062-a3a0-e20f2169c488"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.243522 4700 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.243552 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.243566 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8qbk\" (UniqueName: \"kubernetes.io/projected/08ef410f-221b-4062-a3a0-e20f2169c488-kube-api-access-k8qbk\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.243579 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.253427 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "08ef410f-221b-4062-a3a0-e20f2169c488" (UID: "08ef410f-221b-4062-a3a0-e20f2169c488"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.260893 4700 scope.go:117] "RemoveContainer" containerID="9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.279829 4700 scope.go:117] "RemoveContainer" containerID="8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f" Oct 07 11:38:50 crc kubenswrapper[4700]: E1007 11:38:50.281655 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f\": container with ID starting with 8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f not found: ID does not exist" containerID="8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.281692 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f"} err="failed to get container status \"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f\": rpc error: code = NotFound desc = could not find container \"8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f\": container with ID starting with 8e8a1783e23903d63dcdd26bf47452295ba061c9dca42571739b921dd0e8979f not found: ID does not exist" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.281732 4700 scope.go:117] "RemoveContainer" containerID="9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748" Oct 07 11:38:50 crc kubenswrapper[4700]: E1007 11:38:50.282077 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748\": container with ID starting with 
9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748 not found: ID does not exist" containerID="9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.282110 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748"} err="failed to get container status \"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748\": rpc error: code = NotFound desc = could not find container \"9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748\": container with ID starting with 9fa20af259c3639795e1b24f809c20552c592b54ecad02fd4658e73f808d6748 not found: ID does not exist" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.321711 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-575565f88c-czn8g" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.345046 4700 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ef410f-221b-4062-a3a0-e20f2169c488-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.367481 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.367715 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-759f9b554d-c5s6x" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" containerID="cri-o://370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" gracePeriod=60 Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.529274 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:50 crc kubenswrapper[4700]: I1007 11:38:50.536097 4700 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/neutron-7554f5f8dd-vxdsk"] Oct 07 11:38:51 crc kubenswrapper[4700]: I1007 11:38:51.214092 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:38:51 crc kubenswrapper[4700]: I1007 11:38:51.214114 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:38:51 crc kubenswrapper[4700]: I1007 11:38:51.775936 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 11:38:51 crc kubenswrapper[4700]: I1007 11:38:51.783208 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 11:38:51 crc kubenswrapper[4700]: I1007 11:38:51.973115 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" path="/var/lib/kubelet/pods/08ef410f-221b-4062-a3a0-e20f2169c488/volumes" Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225032 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerStarted","Data":"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513"} Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225897 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225186 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="proxy-httpd" containerID="cri-o://15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513" gracePeriod=30 Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225148 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" 
containerName="ceilometer-central-agent" containerID="cri-o://58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890" gracePeriod=30 Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225211 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-notification-agent" containerID="cri-o://4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9" gracePeriod=30 Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.225201 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="sg-core" containerID="cri-o://c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab" gracePeriod=30 Oct 07 11:38:52 crc kubenswrapper[4700]: I1007 11:38:52.257353 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.04951974 podStartE2EDuration="6.257339774s" podCreationTimestamp="2025-10-07 11:38:46 +0000 UTC" firstStartedPulling="2025-10-07 11:38:47.307615896 +0000 UTC m=+1094.104014885" lastFinishedPulling="2025-10-07 11:38:51.51543593 +0000 UTC m=+1098.311834919" observedRunningTime="2025-10-07 11:38:52.252702233 +0000 UTC m=+1099.049101232" watchObservedRunningTime="2025-10-07 11:38:52.257339774 +0000 UTC m=+1099.053738763" Oct 07 11:38:52 crc kubenswrapper[4700]: E1007 11:38:52.366393 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:38:52 crc kubenswrapper[4700]: E1007 11:38:52.374630 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:38:52 crc kubenswrapper[4700]: E1007 11:38:52.378960 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:38:52 crc kubenswrapper[4700]: E1007 11:38:52.379104 4700 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-759f9b554d-c5s6x" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237455 4700 generic.go:334] "Generic (PLEG): container finished" podID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerID="15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513" exitCode=0 Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237495 4700 generic.go:334] "Generic (PLEG): container finished" podID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerID="c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab" exitCode=2 Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237507 4700 generic.go:334] "Generic (PLEG): container finished" podID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerID="4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9" exitCode=0 Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237778 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerDied","Data":"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513"} Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237821 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerDied","Data":"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab"} Oct 07 11:38:53 crc kubenswrapper[4700]: I1007 11:38:53.237831 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerDied","Data":"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9"} Oct 07 11:38:55 crc kubenswrapper[4700]: I1007 11:38:55.811079 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:55 crc kubenswrapper[4700]: I1007 11:38:55.813718 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:55 crc kubenswrapper[4700]: I1007 11:38:55.848285 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:55 crc kubenswrapper[4700]: I1007 11:38:55.856625 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.262845 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.263134 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.796274 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.876800 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.876852 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.876898 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.876951 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.876965 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.877037 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kglrw\" (UniqueName: 
\"kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.877082 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data\") pod \"87b85928-284c-4a99-81b2-26be9c9d8acf\" (UID: \"87b85928-284c-4a99-81b2-26be9c9d8acf\") " Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.877756 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.879936 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.889455 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw" (OuterVolumeSpecName: "kube-api-access-kglrw") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "kube-api-access-kglrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.890073 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts" (OuterVolumeSpecName: "scripts") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.918400 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.962448 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981256 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kglrw\" (UniqueName: \"kubernetes.io/projected/87b85928-284c-4a99-81b2-26be9c9d8acf-kube-api-access-kglrw\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981287 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981296 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981324 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981333 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87b85928-284c-4a99-81b2-26be9c9d8acf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.981353 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:56 crc kubenswrapper[4700]: I1007 11:38:56.995468 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data" (OuterVolumeSpecName: "config-data") pod "87b85928-284c-4a99-81b2-26be9c9d8acf" (UID: "87b85928-284c-4a99-81b2-26be9c9d8acf"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.082843 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b85928-284c-4a99-81b2-26be9c9d8acf-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.272942 4700 generic.go:334] "Generic (PLEG): container finished" podID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerID="58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890" exitCode=0 Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.273014 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.273059 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerDied","Data":"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890"} Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.273092 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87b85928-284c-4a99-81b2-26be9c9d8acf","Type":"ContainerDied","Data":"54dac636d8b75719e831e6ccbbeb7beea303e7c79a5cf67aa59a7d1f9f80433e"} Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.273110 4700 scope.go:117] "RemoveContainer" containerID="15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.298134 4700 scope.go:117] "RemoveContainer" containerID="c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.315615 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.336139 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.349364 4700 scope.go:117] "RemoveContainer" containerID="4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.384792 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.385116 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="proxy-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385132 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="proxy-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.385145 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-central-agent" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385151 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-central-agent" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.385168 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-notification-agent" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385176 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-notification-agent" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.385187 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-api" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385193 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-api" Oct 07 11:38:57 crc 
kubenswrapper[4700]: E1007 11:38:57.385202 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="sg-core" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385208 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="sg-core" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.385227 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385233 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385409 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="proxy-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385424 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-httpd" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385436 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-central-agent" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385446 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="sg-core" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385457 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ef410f-221b-4062-a3a0-e20f2169c488" containerName="neutron-api" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.385473 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" containerName="ceilometer-notification-agent" Oct 07 11:38:57 crc 
kubenswrapper[4700]: I1007 11:38:57.387210 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.391955 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.392325 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.423808 4700 scope.go:117] "RemoveContainer" containerID="58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.425879 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.459182 4700 scope.go:117] "RemoveContainer" containerID="15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.460641 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513\": container with ID starting with 15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513 not found: ID does not exist" containerID="15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.460707 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513"} err="failed to get container status \"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513\": rpc error: code = NotFound desc = could not find container \"15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513\": container with ID starting with 
15590034b036eb03441919f6e0f80b2cc5054d90b680ea624b049cd1eff18513 not found: ID does not exist" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.460744 4700 scope.go:117] "RemoveContainer" containerID="c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.462526 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab\": container with ID starting with c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab not found: ID does not exist" containerID="c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.462577 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab"} err="failed to get container status \"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab\": rpc error: code = NotFound desc = could not find container \"c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab\": container with ID starting with c2ae3a6b0d2d0d5fd01773f2116a8b784c0147ffe5ca9c23a1686da9ce5700ab not found: ID does not exist" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.462600 4700 scope.go:117] "RemoveContainer" containerID="4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.467404 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9\": container with ID starting with 4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9 not found: ID does not exist" containerID="4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9" Oct 07 11:38:57 crc 
kubenswrapper[4700]: I1007 11:38:57.467459 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9"} err="failed to get container status \"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9\": rpc error: code = NotFound desc = could not find container \"4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9\": container with ID starting with 4762363538b931b06f4bcc27987dc9942b1f61aa91e7d787452e19fd9509a6f9 not found: ID does not exist" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.467488 4700 scope.go:117] "RemoveContainer" containerID="58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890" Oct 07 11:38:57 crc kubenswrapper[4700]: E1007 11:38:57.471850 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890\": container with ID starting with 58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890 not found: ID does not exist" containerID="58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.471922 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890"} err="failed to get container status \"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890\": rpc error: code = NotFound desc = could not find container \"58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890\": container with ID starting with 58bdb19ade8ce95e35546896ae95fb94a0df748f3882c0b1421bab46fe6d5890 not found: ID does not exist" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.490959 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491026 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491074 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491110 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491297 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65596\" (UniqueName: \"kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491387 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd\") pod \"ceilometer-0\" (UID: 
\"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.491710 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.593852 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65596\" (UniqueName: \"kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.593921 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594000 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594053 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594101 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594139 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594217 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594570 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.594776 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.597722 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.598223 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.598434 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.611817 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.615913 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65596\" (UniqueName: \"kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596\") pod \"ceilometer-0\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.724056 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:38:57 crc kubenswrapper[4700]: I1007 11:38:57.967963 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b85928-284c-4a99-81b2-26be9c9d8acf" path="/var/lib/kubelet/pods/87b85928-284c-4a99-81b2-26be9c9d8acf/volumes" Oct 07 11:38:58 crc kubenswrapper[4700]: I1007 11:38:58.230722 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:38:58 crc kubenswrapper[4700]: W1007 11:38:58.233778 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c71136_ec60_4870_b5e0_8196e66155fd.slice/crio-a3c08e46e53706d3b024d1f71b6d8e43b616d9117577b886d76724e060f47abd WatchSource:0}: Error finding container a3c08e46e53706d3b024d1f71b6d8e43b616d9117577b886d76724e060f47abd: Status 404 returned error can't find the container with id a3c08e46e53706d3b024d1f71b6d8e43b616d9117577b886d76724e060f47abd Oct 07 11:38:58 crc kubenswrapper[4700]: I1007 11:38:58.289236 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerStarted","Data":"a3c08e46e53706d3b024d1f71b6d8e43b616d9117577b886d76724e060f47abd"} Oct 07 11:38:58 crc kubenswrapper[4700]: I1007 11:38:58.330039 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:58 crc kubenswrapper[4700]: I1007 11:38:58.330172 4700 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 11:38:58 crc kubenswrapper[4700]: I1007 11:38:58.335175 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.299709 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerStarted","Data":"ad38e2021a95c769c45d648c2360d4b5fb387f0087c7fa842e51732c28dbbcdd"} Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.300556 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-f4pcg"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.301681 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.323446 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-f4pcg"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.405807 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jxw7t"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.406863 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.439712 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jxw7t"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.449003 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrn9z\" (UniqueName: \"kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z\") pod \"nova-api-db-create-f4pcg\" (UID: \"d62cb423-c171-42e0-a579-ec4b427d440a\") " pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.551122 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhk2j\" (UniqueName: \"kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j\") pod \"nova-cell0-db-create-jxw7t\" (UID: \"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa\") " pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:38:59 
crc kubenswrapper[4700]: I1007 11:38:59.551204 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrn9z\" (UniqueName: \"kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z\") pod \"nova-api-db-create-f4pcg\" (UID: \"d62cb423-c171-42e0-a579-ec4b427d440a\") " pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.572014 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrn9z\" (UniqueName: \"kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z\") pod \"nova-api-db-create-f4pcg\" (UID: \"d62cb423-c171-42e0-a579-ec4b427d440a\") " pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.605921 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5dmmx"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.607282 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.620455 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.642029 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5dmmx"] Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.653639 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhk2j\" (UniqueName: \"kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j\") pod \"nova-cell0-db-create-jxw7t\" (UID: \"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa\") " pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.672845 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhk2j\" (UniqueName: \"kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j\") pod \"nova-cell0-db-create-jxw7t\" (UID: \"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa\") " pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.759018 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87d92\" (UniqueName: \"kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92\") pod \"nova-cell1-db-create-5dmmx\" (UID: \"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a\") " pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.854110 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.860512 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87d92\" (UniqueName: \"kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92\") pod \"nova-cell1-db-create-5dmmx\" (UID: \"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a\") " pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.894803 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87d92\" (UniqueName: \"kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92\") pod \"nova-cell1-db-create-5dmmx\" (UID: \"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a\") " pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:38:59 crc kubenswrapper[4700]: I1007 11:38:59.923524 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:39:00 crc kubenswrapper[4700]: I1007 11:39:00.244318 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-f4pcg"] Oct 07 11:39:00 crc kubenswrapper[4700]: I1007 11:39:00.319651 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-f4pcg" event={"ID":"d62cb423-c171-42e0-a579-ec4b427d440a","Type":"ContainerStarted","Data":"7e80080486fa8206a4cefd050995b005d3d8c40b7e7aacc3058cdf611e866526"} Oct 07 11:39:00 crc kubenswrapper[4700]: I1007 11:39:00.321854 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerStarted","Data":"e8dae923838069e574b176c0ba992ca8cb3f7282a011fe292f321a73e2d7c820"} Oct 07 11:39:00 crc kubenswrapper[4700]: I1007 11:39:00.430436 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5dmmx"] Oct 07 11:39:00 crc 
kubenswrapper[4700]: I1007 11:39:00.453052 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jxw7t"] Oct 07 11:39:00 crc kubenswrapper[4700]: W1007 11:39:00.460369 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1029c91b_8d5a_4de4_9fa3_c28f0ea3e5fa.slice/crio-22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604 WatchSource:0}: Error finding container 22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604: Status 404 returned error can't find the container with id 22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604 Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.331488 4700 generic.go:334] "Generic (PLEG): container finished" podID="d62cb423-c171-42e0-a579-ec4b427d440a" containerID="45857cec3ee0b45c8823a4b8f0dca498bc775ba0c6e193d6f84fd99f9ee93ba4" exitCode=0 Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.331561 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-f4pcg" event={"ID":"d62cb423-c171-42e0-a579-ec4b427d440a","Type":"ContainerDied","Data":"45857cec3ee0b45c8823a4b8f0dca498bc775ba0c6e193d6f84fd99f9ee93ba4"} Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.334860 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerStarted","Data":"dd663f0937cb2dd644f9a1edfa7d044c6ea12c2068a3f5d72924d8fd01968c55"} Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.337327 4700 generic.go:334] "Generic (PLEG): container finished" podID="ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" containerID="ca5c7ddde7f22ed9ac8ab549ef3bd576dd79b6a5c2690b8e5d8c0384bc5092b3" exitCode=0 Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.337382 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5dmmx" 
event={"ID":"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a","Type":"ContainerDied","Data":"ca5c7ddde7f22ed9ac8ab549ef3bd576dd79b6a5c2690b8e5d8c0384bc5092b3"} Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.337481 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5dmmx" event={"ID":"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a","Type":"ContainerStarted","Data":"447b0108259cc491fce460a5a46a4d2f6cc2d349462a5debd9ca739b611bbd96"} Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.339283 4700 generic.go:334] "Generic (PLEG): container finished" podID="1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" containerID="073d1599228f72beeb612a3a5d4bbdf370d05377ccd988afaeea78d51d21cd05" exitCode=0 Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.339345 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jxw7t" event={"ID":"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa","Type":"ContainerDied","Data":"073d1599228f72beeb612a3a5d4bbdf370d05377ccd988afaeea78d51d21cd05"} Oct 07 11:39:01 crc kubenswrapper[4700]: I1007 11:39:01.339527 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jxw7t" event={"ID":"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa","Type":"ContainerStarted","Data":"22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604"} Oct 07 11:39:02 crc kubenswrapper[4700]: I1007 11:39:02.350517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerStarted","Data":"e556536cefcf3feeb2c2c67f27c1745c76adace5b01e673de6bfd5b6137197e2"} Oct 07 11:39:02 crc kubenswrapper[4700]: I1007 11:39:02.352407 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:39:02 crc kubenswrapper[4700]: E1007 11:39:02.379678 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:39:02 crc kubenswrapper[4700]: E1007 11:39:02.382155 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:39:02 crc kubenswrapper[4700]: I1007 11:39:02.383783 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.918034718 podStartE2EDuration="5.383766892s" podCreationTimestamp="2025-10-07 11:38:57 +0000 UTC" firstStartedPulling="2025-10-07 11:38:58.237131252 +0000 UTC m=+1105.033530261" lastFinishedPulling="2025-10-07 11:39:01.702863446 +0000 UTC m=+1108.499262435" observedRunningTime="2025-10-07 11:39:02.373254608 +0000 UTC m=+1109.169653617" watchObservedRunningTime="2025-10-07 11:39:02.383766892 +0000 UTC m=+1109.180165881" Oct 07 11:39:02 crc kubenswrapper[4700]: E1007 11:39:02.385037 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 07 11:39:02 crc kubenswrapper[4700]: E1007 11:39:02.385079 4700 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-759f9b554d-c5s6x" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" Oct 07 11:39:02 crc kubenswrapper[4700]: 
I1007 11:39:02.891513 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:39:02 crc kubenswrapper[4700]: I1007 11:39:02.903354 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:39:02 crc kubenswrapper[4700]: I1007 11:39:02.905693 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.017155 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrn9z\" (UniqueName: \"kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z\") pod \"d62cb423-c171-42e0-a579-ec4b427d440a\" (UID: \"d62cb423-c171-42e0-a579-ec4b427d440a\") " Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.017445 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhk2j\" (UniqueName: \"kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j\") pod \"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa\" (UID: \"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa\") " Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.017609 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87d92\" (UniqueName: \"kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92\") pod \"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a\" (UID: \"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a\") " Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.025723 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92" (OuterVolumeSpecName: "kube-api-access-87d92") pod "ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" (UID: "ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a"). 
InnerVolumeSpecName "kube-api-access-87d92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.025770 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j" (OuterVolumeSpecName: "kube-api-access-bhk2j") pod "1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" (UID: "1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa"). InnerVolumeSpecName "kube-api-access-bhk2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.029670 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z" (OuterVolumeSpecName: "kube-api-access-hrn9z") pod "d62cb423-c171-42e0-a579-ec4b427d440a" (UID: "d62cb423-c171-42e0-a579-ec4b427d440a"). InnerVolumeSpecName "kube-api-access-hrn9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.120896 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhk2j\" (UniqueName: \"kubernetes.io/projected/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa-kube-api-access-bhk2j\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.120922 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87d92\" (UniqueName: \"kubernetes.io/projected/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a-kube-api-access-87d92\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.120933 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrn9z\" (UniqueName: \"kubernetes.io/projected/d62cb423-c171-42e0-a579-ec4b427d440a-kube-api-access-hrn9z\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.363426 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-5dmmx" event={"ID":"ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a","Type":"ContainerDied","Data":"447b0108259cc491fce460a5a46a4d2f6cc2d349462a5debd9ca739b611bbd96"} Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.363469 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447b0108259cc491fce460a5a46a4d2f6cc2d349462a5debd9ca739b611bbd96" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.363488 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5dmmx" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.368722 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jxw7t" event={"ID":"1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa","Type":"ContainerDied","Data":"22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604"} Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.368925 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a5c2d3bf7b6f6def794ea83be2bdeac41fff314a2450263d9ed8fd9ea49604" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.369112 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jxw7t" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.378983 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-f4pcg" Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.379029 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-f4pcg" event={"ID":"d62cb423-c171-42e0-a579-ec4b427d440a","Type":"ContainerDied","Data":"7e80080486fa8206a4cefd050995b005d3d8c40b7e7aacc3058cdf611e866526"} Oct 07 11:39:03 crc kubenswrapper[4700]: I1007 11:39:03.379075 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e80080486fa8206a4cefd050995b005d3d8c40b7e7aacc3058cdf611e866526" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.406755 4700 generic.go:334] "Generic (PLEG): container finished" podID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" exitCode=0 Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.406856 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-759f9b554d-c5s6x" event={"ID":"b7389d16-ec46-47d3-9466-2b844946b6c6","Type":"ContainerDied","Data":"370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19"} Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.408398 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-759f9b554d-c5s6x" event={"ID":"b7389d16-ec46-47d3-9466-2b844946b6c6","Type":"ContainerDied","Data":"5091ad7e301d50258a7b61dce2237a37cdff4992ba75a4d5370ddf3d6f90764e"} Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.408414 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5091ad7e301d50258a7b61dce2237a37cdff4992ba75a4d5370ddf3d6f90764e" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.450394 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.543376 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle\") pod \"b7389d16-ec46-47d3-9466-2b844946b6c6\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.543464 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data\") pod \"b7389d16-ec46-47d3-9466-2b844946b6c6\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.543746 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5cd\" (UniqueName: \"kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd\") pod \"b7389d16-ec46-47d3-9466-2b844946b6c6\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.543812 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom\") pod \"b7389d16-ec46-47d3-9466-2b844946b6c6\" (UID: \"b7389d16-ec46-47d3-9466-2b844946b6c6\") " Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.549257 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b7389d16-ec46-47d3-9466-2b844946b6c6" (UID: "b7389d16-ec46-47d3-9466-2b844946b6c6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.550196 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd" (OuterVolumeSpecName: "kube-api-access-zs5cd") pod "b7389d16-ec46-47d3-9466-2b844946b6c6" (UID: "b7389d16-ec46-47d3-9466-2b844946b6c6"). InnerVolumeSpecName "kube-api-access-zs5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.584093 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7389d16-ec46-47d3-9466-2b844946b6c6" (UID: "b7389d16-ec46-47d3-9466-2b844946b6c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.595683 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data" (OuterVolumeSpecName: "config-data") pod "b7389d16-ec46-47d3-9466-2b844946b6c6" (UID: "b7389d16-ec46-47d3-9466-2b844946b6c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.646063 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5cd\" (UniqueName: \"kubernetes.io/projected/b7389d16-ec46-47d3-9466-2b844946b6c6-kube-api-access-zs5cd\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.646099 4700 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.646115 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:04 crc kubenswrapper[4700]: I1007 11:39:04.646131 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7389d16-ec46-47d3-9466-2b844946b6c6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:05 crc kubenswrapper[4700]: I1007 11:39:05.415119 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-759f9b554d-c5s6x" Oct 07 11:39:05 crc kubenswrapper[4700]: I1007 11:39:05.455554 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:39:05 crc kubenswrapper[4700]: I1007 11:39:05.466867 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-759f9b554d-c5s6x"] Oct 07 11:39:05 crc kubenswrapper[4700]: I1007 11:39:05.972102 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" path="/var/lib/kubelet/pods/b7389d16-ec46-47d3-9466-2b844946b6c6/volumes" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.537539 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f251-account-create-fjksm"] Oct 07 11:39:09 crc kubenswrapper[4700]: E1007 11:39:09.538460 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62cb423-c171-42e0-a579-ec4b427d440a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538475 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62cb423-c171-42e0-a579-ec4b427d440a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: E1007 11:39:09.538514 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538522 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: E1007 11:39:09.538543 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538553 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" Oct 07 11:39:09 crc kubenswrapper[4700]: E1007 11:39:09.538592 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538600 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538793 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538825 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7389d16-ec46-47d3-9466-2b844946b6c6" containerName="heat-engine" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538838 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.538848 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62cb423-c171-42e0-a579-ec4b427d440a" containerName="mariadb-database-create" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.539571 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.546057 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.548687 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f251-account-create-fjksm"] Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.628924 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4rh\" (UniqueName: \"kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh\") pod \"nova-api-f251-account-create-fjksm\" (UID: \"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d\") " pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.731432 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4rh\" (UniqueName: \"kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh\") pod \"nova-api-f251-account-create-fjksm\" (UID: \"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d\") " pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.743139 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0850-account-create-hvrkb"] Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.744954 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.746171 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.753273 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4rh\" (UniqueName: \"kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh\") pod \"nova-api-f251-account-create-fjksm\" (UID: \"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d\") " pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.759695 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0850-account-create-hvrkb"] Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.833391 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz9xz\" (UniqueName: \"kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz\") pod \"nova-cell0-0850-account-create-hvrkb\" (UID: \"3e7a1185-eb36-4f41-83ad-98cf6b028339\") " pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.865532 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.934998 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz9xz\" (UniqueName: \"kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz\") pod \"nova-cell0-0850-account-create-hvrkb\" (UID: \"3e7a1185-eb36-4f41-83ad-98cf6b028339\") " pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.937656 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4be5-account-create-lnppp"] Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.939270 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.941829 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.945822 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4be5-account-create-lnppp"] Oct 07 11:39:09 crc kubenswrapper[4700]: I1007 11:39:09.961489 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz9xz\" (UniqueName: \"kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz\") pod \"nova-cell0-0850-account-create-hvrkb\" (UID: \"3e7a1185-eb36-4f41-83ad-98cf6b028339\") " pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.036289 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6fk\" (UniqueName: \"kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk\") pod \"nova-cell1-4be5-account-create-lnppp\" (UID: \"ff7a27a6-846b-4217-98d2-0bb89c409392\") " 
pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.105879 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.138757 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6fk\" (UniqueName: \"kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk\") pod \"nova-cell1-4be5-account-create-lnppp\" (UID: \"ff7a27a6-846b-4217-98d2-0bb89c409392\") " pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.154988 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6fk\" (UniqueName: \"kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk\") pod \"nova-cell1-4be5-account-create-lnppp\" (UID: \"ff7a27a6-846b-4217-98d2-0bb89c409392\") " pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.334093 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.367561 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f251-account-create-fjksm"] Oct 07 11:39:10 crc kubenswrapper[4700]: W1007 11:39:10.377179 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb5a0b33_a377_4d96_acc6_3e9eb27bff2d.slice/crio-b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc WatchSource:0}: Error finding container b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc: Status 404 returned error can't find the container with id b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.476949 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-fjksm" event={"ID":"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d","Type":"ContainerStarted","Data":"b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc"} Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.546064 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0850-account-create-hvrkb"] Oct 07 11:39:10 crc kubenswrapper[4700]: I1007 11:39:10.585523 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4be5-account-create-lnppp"] Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.494030 4700 generic.go:334] "Generic (PLEG): container finished" podID="cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" containerID="a348301cf929b80c8678ceb34ed69c857905dc7228c177378d33e2b0745d4f80" exitCode=0 Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.494140 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-fjksm" 
event={"ID":"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d","Type":"ContainerDied","Data":"a348301cf929b80c8678ceb34ed69c857905dc7228c177378d33e2b0745d4f80"} Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.496386 4700 generic.go:334] "Generic (PLEG): container finished" podID="ff7a27a6-846b-4217-98d2-0bb89c409392" containerID="1ae71949fca53228ee2b9b377af103076f6d32d2cbc715fcabe1ee9cb078d549" exitCode=0 Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.496467 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4be5-account-create-lnppp" event={"ID":"ff7a27a6-846b-4217-98d2-0bb89c409392","Type":"ContainerDied","Data":"1ae71949fca53228ee2b9b377af103076f6d32d2cbc715fcabe1ee9cb078d549"} Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.496518 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4be5-account-create-lnppp" event={"ID":"ff7a27a6-846b-4217-98d2-0bb89c409392","Type":"ContainerStarted","Data":"f008b523f4ba60d5b4612494abae0ce4be22d945e370975c30212cc93d0db970"} Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.498913 4700 generic.go:334] "Generic (PLEG): container finished" podID="3e7a1185-eb36-4f41-83ad-98cf6b028339" containerID="b11a9bfdec60767c7fd2db0e8957dfd00bdf83fb9b82fb97e44dc837a3de0005" exitCode=0 Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.498949 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0850-account-create-hvrkb" event={"ID":"3e7a1185-eb36-4f41-83ad-98cf6b028339","Type":"ContainerDied","Data":"b11a9bfdec60767c7fd2db0e8957dfd00bdf83fb9b82fb97e44dc837a3de0005"} Oct 07 11:39:11 crc kubenswrapper[4700]: I1007 11:39:11.498976 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0850-account-create-hvrkb" event={"ID":"3e7a1185-eb36-4f41-83ad-98cf6b028339","Type":"ContainerStarted","Data":"925cb47333be640091515a03b4bbe15fd611520257319e903f673b85d25ccd88"} Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 
11:39:13.029155 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.036847 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.050657 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.096701 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz9xz\" (UniqueName: \"kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz\") pod \"3e7a1185-eb36-4f41-83ad-98cf6b028339\" (UID: \"3e7a1185-eb36-4f41-83ad-98cf6b028339\") " Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.096768 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt6fk\" (UniqueName: \"kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk\") pod \"ff7a27a6-846b-4217-98d2-0bb89c409392\" (UID: \"ff7a27a6-846b-4217-98d2-0bb89c409392\") " Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.096923 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb4rh\" (UniqueName: \"kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh\") pod \"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d\" (UID: \"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d\") " Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.103562 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk" (OuterVolumeSpecName: "kube-api-access-pt6fk") pod "ff7a27a6-846b-4217-98d2-0bb89c409392" (UID: 
"ff7a27a6-846b-4217-98d2-0bb89c409392"). InnerVolumeSpecName "kube-api-access-pt6fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.103655 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh" (OuterVolumeSpecName: "kube-api-access-nb4rh") pod "cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" (UID: "cb5a0b33-a377-4d96-acc6-3e9eb27bff2d"). InnerVolumeSpecName "kube-api-access-nb4rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.105143 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz" (OuterVolumeSpecName: "kube-api-access-sz9xz") pod "3e7a1185-eb36-4f41-83ad-98cf6b028339" (UID: "3e7a1185-eb36-4f41-83ad-98cf6b028339"). InnerVolumeSpecName "kube-api-access-sz9xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.198890 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz9xz\" (UniqueName: \"kubernetes.io/projected/3e7a1185-eb36-4f41-83ad-98cf6b028339-kube-api-access-sz9xz\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.198922 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt6fk\" (UniqueName: \"kubernetes.io/projected/ff7a27a6-846b-4217-98d2-0bb89c409392-kube-api-access-pt6fk\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.198946 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb4rh\" (UniqueName: \"kubernetes.io/projected/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d-kube-api-access-nb4rh\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.531690 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0850-account-create-hvrkb" event={"ID":"3e7a1185-eb36-4f41-83ad-98cf6b028339","Type":"ContainerDied","Data":"925cb47333be640091515a03b4bbe15fd611520257319e903f673b85d25ccd88"} Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.531729 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="925cb47333be640091515a03b4bbe15fd611520257319e903f673b85d25ccd88" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.531741 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0850-account-create-hvrkb" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.533545 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-fjksm" event={"ID":"cb5a0b33-a377-4d96-acc6-3e9eb27bff2d","Type":"ContainerDied","Data":"b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc"} Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.533565 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b01e67a9e032f2ce4476426363ff49ecee032f054d22fbf0e97b32e9de8d1dfc" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.533604 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f251-account-create-fjksm" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.536209 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4be5-account-create-lnppp" event={"ID":"ff7a27a6-846b-4217-98d2-0bb89c409392","Type":"ContainerDied","Data":"f008b523f4ba60d5b4612494abae0ce4be22d945e370975c30212cc93d0db970"} Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.536259 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f008b523f4ba60d5b4612494abae0ce4be22d945e370975c30212cc93d0db970" Oct 07 11:39:13 crc kubenswrapper[4700]: I1007 11:39:13.536274 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4be5-account-create-lnppp" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.011422 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cj2z"] Oct 07 11:39:15 crc kubenswrapper[4700]: E1007 11:39:15.012303 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012345 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: E1007 11:39:15.012363 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7a27a6-846b-4217-98d2-0bb89c409392" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012373 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7a27a6-846b-4217-98d2-0bb89c409392" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: E1007 11:39:15.012404 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7a1185-eb36-4f41-83ad-98cf6b028339" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012416 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7a1185-eb36-4f41-83ad-98cf6b028339" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012685 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012711 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7a27a6-846b-4217-98d2-0bb89c409392" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.012726 4700 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3e7a1185-eb36-4f41-83ad-98cf6b028339" containerName="mariadb-account-create" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.013776 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.019413 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zcxqt" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.021010 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.021038 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.025973 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cj2z"] Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.028742 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwh5c\" (UniqueName: \"kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.028808 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.028853 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.029056 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.130118 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwh5c\" (UniqueName: \"kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.130186 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.130224 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.130338 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.136013 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.139320 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.140212 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.156185 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwh5c\" (UniqueName: \"kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c\") pod \"nova-cell0-conductor-db-sync-6cj2z\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.331780 4700 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.333856 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.333910 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:39:15 crc kubenswrapper[4700]: I1007 11:39:15.770502 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cj2z"] Oct 07 11:39:15 crc kubenswrapper[4700]: W1007 11:39:15.777296 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded147029_af21_44d2_9243_98161b542425.slice/crio-4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e WatchSource:0}: Error finding container 4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e: Status 404 returned error can't find the container with id 4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e Oct 07 11:39:16 crc kubenswrapper[4700]: I1007 11:39:16.568071 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" event={"ID":"ed147029-af21-44d2-9243-98161b542425","Type":"ContainerStarted","Data":"4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e"} Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.958325 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.958900 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-central-agent" containerID="cri-o://ad38e2021a95c769c45d648c2360d4b5fb387f0087c7fa842e51732c28dbbcdd" gracePeriod=30 Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.959431 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="proxy-httpd" containerID="cri-o://e556536cefcf3feeb2c2c67f27c1745c76adace5b01e673de6bfd5b6137197e2" gracePeriod=30 Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.959565 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="sg-core" containerID="cri-o://dd663f0937cb2dd644f9a1edfa7d044c6ea12c2068a3f5d72924d8fd01968c55" gracePeriod=30 Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.959674 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-notification-agent" containerID="cri-o://e8dae923838069e574b176c0ba992ca8cb3f7282a011fe292f321a73e2d7c820" gracePeriod=30 Oct 07 11:39:17 crc kubenswrapper[4700]: I1007 11:39:17.971658 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.184:3000/\": EOF" Oct 07 11:39:18 crc kubenswrapper[4700]: I1007 11:39:18.587085 4700 generic.go:334] "Generic (PLEG): container finished" podID="70c71136-ec60-4870-b5e0-8196e66155fd" containerID="e556536cefcf3feeb2c2c67f27c1745c76adace5b01e673de6bfd5b6137197e2" exitCode=0 Oct 07 11:39:18 crc 
kubenswrapper[4700]: I1007 11:39:18.587393 4700 generic.go:334] "Generic (PLEG): container finished" podID="70c71136-ec60-4870-b5e0-8196e66155fd" containerID="dd663f0937cb2dd644f9a1edfa7d044c6ea12c2068a3f5d72924d8fd01968c55" exitCode=2 Oct 07 11:39:18 crc kubenswrapper[4700]: I1007 11:39:18.587403 4700 generic.go:334] "Generic (PLEG): container finished" podID="70c71136-ec60-4870-b5e0-8196e66155fd" containerID="ad38e2021a95c769c45d648c2360d4b5fb387f0087c7fa842e51732c28dbbcdd" exitCode=0 Oct 07 11:39:18 crc kubenswrapper[4700]: I1007 11:39:18.587177 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerDied","Data":"e556536cefcf3feeb2c2c67f27c1745c76adace5b01e673de6bfd5b6137197e2"} Oct 07 11:39:18 crc kubenswrapper[4700]: I1007 11:39:18.587438 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerDied","Data":"dd663f0937cb2dd644f9a1edfa7d044c6ea12c2068a3f5d72924d8fd01968c55"} Oct 07 11:39:18 crc kubenswrapper[4700]: I1007 11:39:18.587467 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerDied","Data":"ad38e2021a95c769c45d648c2360d4b5fb387f0087c7fa842e51732c28dbbcdd"} Oct 07 11:39:21 crc kubenswrapper[4700]: I1007 11:39:21.620500 4700 generic.go:334] "Generic (PLEG): container finished" podID="70c71136-ec60-4870-b5e0-8196e66155fd" containerID="e8dae923838069e574b176c0ba992ca8cb3f7282a011fe292f321a73e2d7c820" exitCode=0 Oct 07 11:39:21 crc kubenswrapper[4700]: I1007 11:39:21.620581 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerDied","Data":"e8dae923838069e574b176c0ba992ca8cb3f7282a011fe292f321a73e2d7c820"} Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.726789 4700 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.881050 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.881350 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.881395 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.881487 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65596\" (UniqueName: \"kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.882200 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.882347 4700 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.882422 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.882686 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.882448 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts\") pod \"70c71136-ec60-4870-b5e0-8196e66155fd\" (UID: \"70c71136-ec60-4870-b5e0-8196e66155fd\") " Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.883481 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.883498 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70c71136-ec60-4870-b5e0-8196e66155fd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.886353 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts" (OuterVolumeSpecName: "scripts") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.886410 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596" (OuterVolumeSpecName: "kube-api-access-65596") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "kube-api-access-65596". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.907590 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.964036 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.969539 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data" (OuterVolumeSpecName: "config-data") pod "70c71136-ec60-4870-b5e0-8196e66155fd" (UID: "70c71136-ec60-4870-b5e0-8196e66155fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.985372 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.985405 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.985415 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65596\" (UniqueName: \"kubernetes.io/projected/70c71136-ec60-4870-b5e0-8196e66155fd-kube-api-access-65596\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.985426 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:22 crc kubenswrapper[4700]: I1007 11:39:22.985469 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c71136-ec60-4870-b5e0-8196e66155fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.647955 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70c71136-ec60-4870-b5e0-8196e66155fd","Type":"ContainerDied","Data":"a3c08e46e53706d3b024d1f71b6d8e43b616d9117577b886d76724e060f47abd"} Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.648014 4700 scope.go:117] "RemoveContainer" containerID="e556536cefcf3feeb2c2c67f27c1745c76adace5b01e673de6bfd5b6137197e2" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.648167 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.655966 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" event={"ID":"ed147029-af21-44d2-9243-98161b542425","Type":"ContainerStarted","Data":"51cec8adf1716593ae4097d82c2d01a7ee177da51ae7618ccd19de52e48a8c00"} Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.690173 4700 scope.go:117] "RemoveContainer" containerID="dd663f0937cb2dd644f9a1edfa7d044c6ea12c2068a3f5d72924d8fd01968c55" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.710849 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" podStartSLOduration=3.051001137 podStartE2EDuration="9.710828118s" podCreationTimestamp="2025-10-07 11:39:14 +0000 UTC" firstStartedPulling="2025-10-07 11:39:15.779585959 +0000 UTC m=+1122.575984948" lastFinishedPulling="2025-10-07 11:39:22.43941294 +0000 UTC m=+1129.235811929" observedRunningTime="2025-10-07 11:39:23.688725052 +0000 UTC m=+1130.485124041" watchObservedRunningTime="2025-10-07 11:39:23.710828118 +0000 UTC m=+1130.507227107" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.714875 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.727338 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.754961 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:39:23 crc kubenswrapper[4700]: E1007 11:39:23.756041 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="sg-core" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.756093 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" 
containerName="sg-core" Oct 07 11:39:23 crc kubenswrapper[4700]: E1007 11:39:23.756119 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-central-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.756131 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-central-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: E1007 11:39:23.756152 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-notification-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.756164 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-notification-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: E1007 11:39:23.756205 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="proxy-httpd" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.756216 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="proxy-httpd" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.758076 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-central-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.758106 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="proxy-httpd" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.758127 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="sg-core" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.758149 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="70c71136-ec60-4870-b5e0-8196e66155fd" containerName="ceilometer-notification-agent" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.761080 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.764257 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.764502 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.768744 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.793250 4700 scope.go:117] "RemoveContainer" containerID="e8dae923838069e574b176c0ba992ca8cb3f7282a011fe292f321a73e2d7c820" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.802779 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.803846 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchbt\" (UniqueName: \"kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.803888 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data\") pod \"ceilometer-0\" (UID: 
\"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.803950 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.803992 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.804014 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.804064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.835442 4700 scope.go:117] "RemoveContainer" containerID="ad38e2021a95c769c45d648c2360d4b5fb387f0087c7fa842e51732c28dbbcdd" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.905852 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts\") pod \"ceilometer-0\" 
(UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.905919 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.905951 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.906044 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.906173 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.906299 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchbt\" (UniqueName: \"kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.906347 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.907409 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.907504 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.910698 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.920448 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.924288 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 
11:39:23.927771 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchbt\" (UniqueName: \"kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.934914 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts\") pod \"ceilometer-0\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " pod="openstack/ceilometer-0" Oct 07 11:39:23 crc kubenswrapper[4700]: I1007 11:39:23.991169 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c71136-ec60-4870-b5e0-8196e66155fd" path="/var/lib/kubelet/pods/70c71136-ec60-4870-b5e0-8196e66155fd/volumes" Oct 07 11:39:24 crc kubenswrapper[4700]: I1007 11:39:24.128557 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:39:24 crc kubenswrapper[4700]: I1007 11:39:24.572729 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:39:24 crc kubenswrapper[4700]: W1007 11:39:24.578034 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb765c26c_24c0_41b8_a126_0524806c134d.slice/crio-6902e3bc236407e12ed29d73d61d13cdb85e8be92dba60de94684c9779c4aa06 WatchSource:0}: Error finding container 6902e3bc236407e12ed29d73d61d13cdb85e8be92dba60de94684c9779c4aa06: Status 404 returned error can't find the container with id 6902e3bc236407e12ed29d73d61d13cdb85e8be92dba60de94684c9779c4aa06 Oct 07 11:39:24 crc kubenswrapper[4700]: I1007 11:39:24.668438 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerStarted","Data":"6902e3bc236407e12ed29d73d61d13cdb85e8be92dba60de94684c9779c4aa06"} Oct 07 11:39:25 crc kubenswrapper[4700]: I1007 11:39:25.680826 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerStarted","Data":"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97"} Oct 07 11:39:26 crc kubenswrapper[4700]: I1007 11:39:26.699107 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerStarted","Data":"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb"} Oct 07 11:39:27 crc kubenswrapper[4700]: I1007 11:39:27.716161 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerStarted","Data":"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc"} Oct 07 11:39:28 crc kubenswrapper[4700]: I1007 
11:39:28.732233 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerStarted","Data":"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a"} Oct 07 11:39:28 crc kubenswrapper[4700]: I1007 11:39:28.734629 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:39:28 crc kubenswrapper[4700]: I1007 11:39:28.768979 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.109331644 podStartE2EDuration="5.768955868s" podCreationTimestamp="2025-10-07 11:39:23 +0000 UTC" firstStartedPulling="2025-10-07 11:39:24.580297874 +0000 UTC m=+1131.376696863" lastFinishedPulling="2025-10-07 11:39:28.239922088 +0000 UTC m=+1135.036321087" observedRunningTime="2025-10-07 11:39:28.756415452 +0000 UTC m=+1135.552814471" watchObservedRunningTime="2025-10-07 11:39:28.768955868 +0000 UTC m=+1135.565354867" Oct 07 11:39:32 crc kubenswrapper[4700]: I1007 11:39:32.783952 4700 generic.go:334] "Generic (PLEG): container finished" podID="ed147029-af21-44d2-9243-98161b542425" containerID="51cec8adf1716593ae4097d82c2d01a7ee177da51ae7618ccd19de52e48a8c00" exitCode=0 Oct 07 11:39:32 crc kubenswrapper[4700]: I1007 11:39:32.784075 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" event={"ID":"ed147029-af21-44d2-9243-98161b542425","Type":"ContainerDied","Data":"51cec8adf1716593ae4097d82c2d01a7ee177da51ae7618ccd19de52e48a8c00"} Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.146293 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.317408 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data\") pod \"ed147029-af21-44d2-9243-98161b542425\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.317515 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle\") pod \"ed147029-af21-44d2-9243-98161b542425\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.317565 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwh5c\" (UniqueName: \"kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c\") pod \"ed147029-af21-44d2-9243-98161b542425\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.317723 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts\") pod \"ed147029-af21-44d2-9243-98161b542425\" (UID: \"ed147029-af21-44d2-9243-98161b542425\") " Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.322699 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts" (OuterVolumeSpecName: "scripts") pod "ed147029-af21-44d2-9243-98161b542425" (UID: "ed147029-af21-44d2-9243-98161b542425"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.324165 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c" (OuterVolumeSpecName: "kube-api-access-jwh5c") pod "ed147029-af21-44d2-9243-98161b542425" (UID: "ed147029-af21-44d2-9243-98161b542425"). InnerVolumeSpecName "kube-api-access-jwh5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.349272 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed147029-af21-44d2-9243-98161b542425" (UID: "ed147029-af21-44d2-9243-98161b542425"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.349298 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data" (OuterVolumeSpecName: "config-data") pod "ed147029-af21-44d2-9243-98161b542425" (UID: "ed147029-af21-44d2-9243-98161b542425"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.419728 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.419757 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.419768 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed147029-af21-44d2-9243-98161b542425-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.419778 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwh5c\" (UniqueName: \"kubernetes.io/projected/ed147029-af21-44d2-9243-98161b542425-kube-api-access-jwh5c\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.806994 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" event={"ID":"ed147029-af21-44d2-9243-98161b542425","Type":"ContainerDied","Data":"4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e"} Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.807594 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ffed46416385797bc5450c2350f440195fe57f5e9085b654d0fc238bd86198e" Oct 07 11:39:34 crc kubenswrapper[4700]: I1007 11:39:34.807061 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cj2z" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.038856 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 11:39:35 crc kubenswrapper[4700]: E1007 11:39:35.039415 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed147029-af21-44d2-9243-98161b542425" containerName="nova-cell0-conductor-db-sync" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.039435 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed147029-af21-44d2-9243-98161b542425" containerName="nova-cell0-conductor-db-sync" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.039701 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed147029-af21-44d2-9243-98161b542425" containerName="nova-cell0-conductor-db-sync" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.040486 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.043114 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.043238 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zcxqt" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.058330 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.234588 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 
11:39:35.234644 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.234969 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzrx\" (UniqueName: \"kubernetes.io/projected/606c2b50-ba1e-4181-8615-29f434e0597e-kube-api-access-dwzrx\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.337700 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.337775 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.337838 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzrx\" (UniqueName: \"kubernetes.io/projected/606c2b50-ba1e-4181-8615-29f434e0597e-kube-api-access-dwzrx\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.347531 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.350001 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606c2b50-ba1e-4181-8615-29f434e0597e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.360953 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwzrx\" (UniqueName: \"kubernetes.io/projected/606c2b50-ba1e-4181-8615-29f434e0597e-kube-api-access-dwzrx\") pod \"nova-cell0-conductor-0\" (UID: \"606c2b50-ba1e-4181-8615-29f434e0597e\") " pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.370121 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:35 crc kubenswrapper[4700]: I1007 11:39:35.837219 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 11:39:35 crc kubenswrapper[4700]: W1007 11:39:35.837630 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod606c2b50_ba1e_4181_8615_29f434e0597e.slice/crio-c6914e95cc9119837f5acc82924884a8ed568261eef4c4412f77197e6c85b752 WatchSource:0}: Error finding container c6914e95cc9119837f5acc82924884a8ed568261eef4c4412f77197e6c85b752: Status 404 returned error can't find the container with id c6914e95cc9119837f5acc82924884a8ed568261eef4c4412f77197e6c85b752 Oct 07 11:39:36 crc kubenswrapper[4700]: I1007 11:39:36.829519 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"606c2b50-ba1e-4181-8615-29f434e0597e","Type":"ContainerStarted","Data":"e7a946f40e76dab7bcf859c4bcf446c3c7ec0457d611b840f5bbef35e69be16c"} Oct 07 11:39:36 crc kubenswrapper[4700]: I1007 11:39:36.830246 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"606c2b50-ba1e-4181-8615-29f434e0597e","Type":"ContainerStarted","Data":"c6914e95cc9119837f5acc82924884a8ed568261eef4c4412f77197e6c85b752"} Oct 07 11:39:36 crc kubenswrapper[4700]: I1007 11:39:36.830272 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:36 crc kubenswrapper[4700]: I1007 11:39:36.859814 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.859796325 podStartE2EDuration="1.859796325s" podCreationTimestamp="2025-10-07 11:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
11:39:36.850912953 +0000 UTC m=+1143.647312012" watchObservedRunningTime="2025-10-07 11:39:36.859796325 +0000 UTC m=+1143.656195314" Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.334052 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.334586 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.334665 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.335921 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.336049 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71" gracePeriod=600 Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.423645 4700 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.930645 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71" exitCode=0 Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.930793 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71"} Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.930959 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363"} Oct 07 11:39:45 crc kubenswrapper[4700]: I1007 11:39:45.930983 4700 scope.go:117] "RemoveContainer" containerID="b9e5650b4ada44376befecad6ee06386391296fc23c23a71914ec3f35d9306ee" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.021798 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7vgrf"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.022927 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.026985 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.029614 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.079767 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7vgrf"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.186763 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.187058 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.187104 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9669\" (UniqueName: \"kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.187127 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.195423 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.198425 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.202779 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.213278 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.258484 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.259590 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.265708 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.281416 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.283009 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.284430 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.289516 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.289925 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.289974 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9669\" (UniqueName: \"kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.289991 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.298735 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.299334 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.302388 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.317502 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.329950 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9669\" (UniqueName: \"kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669\") pod \"nova-cell0-cell-mapping-7vgrf\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.335376 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.375788 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.377387 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.379122 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392613 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392659 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392692 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdc9g\" (UniqueName: \"kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392736 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392800 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs\") pod 
\"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392822 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392898 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392952 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.392998 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnb7\" (UniqueName: \"kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.393082 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hs9l\" (UniqueName: \"kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l\") pod \"nova-scheduler-0\" (UID: 
\"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.398146 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.405328 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.485774 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.487821 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.494216 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.494255 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.494285 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.494418 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.494457 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495152 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnb7\" (UniqueName: \"kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495193 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxwk\" (UniqueName: \"kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495249 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hs9l\" (UniqueName: \"kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495270 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495291 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495324 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495352 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495372 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdc9g\" (UniqueName: \"kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495391 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" 
Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.495541 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.504535 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.504641 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.505807 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.506762 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.511772 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.522358 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.525785 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.531621 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnb7\" (UniqueName: \"kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.539074 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdc9g\" (UniqueName: \"kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g\") pod \"nova-api-0\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.544087 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hs9l\" (UniqueName: \"kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l\") pod \"nova-scheduler-0\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " pod="openstack/nova-scheduler-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.596755 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.596832 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgc4\" (UniqueName: \"kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597375 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597410 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597451 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597479 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597510 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxwk\" (UniqueName: \"kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597552 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597597 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.597611 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.598175 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs\") pod \"nova-metadata-0\" (UID: 
\"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.603152 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.618230 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.623837 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxwk\" (UniqueName: \"kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk\") pod \"nova-metadata-0\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.691605 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700752 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgc4\" (UniqueName: \"kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700800 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700831 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700866 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700901 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 
07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.700964 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.702639 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.703009 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.703479 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.703711 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.703815 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.714969 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.732220 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgc4\" (UniqueName: \"kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4\") pod \"dnsmasq-dns-5fbc4d444f-c9svf\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.740061 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.752265 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:46 crc kubenswrapper[4700]: I1007 11:39:46.821131 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.120504 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7vgrf"] Oct 07 11:39:47 crc kubenswrapper[4700]: W1007 11:39:47.222961 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f14d077_e353_4931_a13e_c2184c602276.slice/crio-591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d WatchSource:0}: Error finding container 591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d: Status 404 returned error can't find the container with id 591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.308951 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk898"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.310796 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.313130 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.313457 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.323688 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk898"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.351277 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.426017 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.426093 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.426185 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 
07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.426221 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgrp\" (UniqueName: \"kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.527395 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.527737 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.527806 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.527843 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgrp\" (UniqueName: \"kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " 
pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.539085 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.539475 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.570958 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.597059 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgrp\" (UniqueName: \"kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp\") pod \"nova-cell1-conductor-db-sync-hk898\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.673801 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.706397 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.713413 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.719889 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:39:47 crc kubenswrapper[4700]: I1007 11:39:47.774706 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:47 crc kubenswrapper[4700]: W1007 11:39:47.832237 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a351e5c_84ad_4252_96c0_87b770185cca.slice/crio-330c5300ce113eefbae56aa835c6a1076b4b037b2b712a347520657feed2bf85 WatchSource:0}: Error finding container 330c5300ce113eefbae56aa835c6a1076b4b037b2b712a347520657feed2bf85: Status 404 returned error can't find the container with id 330c5300ce113eefbae56aa835c6a1076b4b037b2b712a347520657feed2bf85 Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.011730 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerStarted","Data":"8db1ae417e647abce81569a6e559bff8f4f9d9c18816f1e3a243c8526d512762"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.013830 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7vgrf" event={"ID":"8f14d077-e353-4931-a13e-c2184c602276","Type":"ContainerStarted","Data":"c1af685aad61d376c1cfc82a063190289f77886b4ad1e6339fbed2b04f28c047"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.013855 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7vgrf" 
event={"ID":"8f14d077-e353-4931-a13e-c2184c602276","Type":"ContainerStarted","Data":"591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.014882 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" event={"ID":"bd2c5b31-ee31-445b-8f37-5f4def71e84e","Type":"ContainerStarted","Data":"abf76f66574295971ce593d81a56901c03e88d4b07599a4002e68b2ebdef8db8"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.016603 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerStarted","Data":"330c5300ce113eefbae56aa835c6a1076b4b037b2b712a347520657feed2bf85"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.019155 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05ed3d4c-807a-4022-a1e0-6e2fc312c99b","Type":"ContainerStarted","Data":"c8fbb1930cca26990610056fe7841bf7ad8e7f0d2ee52c1090da96e2a1b7f7c4"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.020215 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc56a36f-8764-4037-8c5a-e14a0ee8e309","Type":"ContainerStarted","Data":"537285339f143d2e980e9a38bc3c93df2419237c52c31201159ef69a7242fbfe"} Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 11:39:48.032240 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7vgrf" podStartSLOduration=2.032221969 podStartE2EDuration="2.032221969s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:39:48.027488316 +0000 UTC m=+1154.823887295" watchObservedRunningTime="2025-10-07 11:39:48.032221969 +0000 UTC m=+1154.828620958" Oct 07 11:39:48 crc kubenswrapper[4700]: I1007 
11:39:48.320073 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk898"] Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.040314 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk898" event={"ID":"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d","Type":"ContainerStarted","Data":"deadea4bbdd14a2a9e9ae2298fb29d71b595d3d35bbee933462eb6a99a59727a"} Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.040880 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk898" event={"ID":"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d","Type":"ContainerStarted","Data":"6a9bd262c75e83ed3650b86273d2b84a7d0fac6bbfedf336a980b52c816764d1"} Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.046932 4700 generic.go:334] "Generic (PLEG): container finished" podID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerID="276f1b7b755a38b65f42320526b73e464b73cd602d0a8ea7c35f278534e947d7" exitCode=0 Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.048285 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" event={"ID":"bd2c5b31-ee31-445b-8f37-5f4def71e84e","Type":"ContainerDied","Data":"276f1b7b755a38b65f42320526b73e464b73cd602d0a8ea7c35f278534e947d7"} Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.061925 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hk898" podStartSLOduration=2.06190035 podStartE2EDuration="2.06190035s" podCreationTimestamp="2025-10-07 11:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:39:49.059164709 +0000 UTC m=+1155.855563698" watchObservedRunningTime="2025-10-07 11:39:49.06190035 +0000 UTC m=+1155.858299339" Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.784447 4700 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:49 crc kubenswrapper[4700]: I1007 11:39:49.817034 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.077433 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc56a36f-8764-4037-8c5a-e14a0ee8e309","Type":"ContainerStarted","Data":"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.077549 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c" gracePeriod=30 Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.092878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerStarted","Data":"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.092939 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerStarted","Data":"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.096947 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" event={"ID":"bd2c5b31-ee31-445b-8f37-5f4def71e84e","Type":"ContainerStarted","Data":"9acc0339b9a2fc597468ca200b34801b9e52360bb8f3b2bf05a27c107357cbb1"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.097988 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 
07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.105503 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerStarted","Data":"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.105757 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerStarted","Data":"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.105755 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-log" containerID="cri-o://12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" gracePeriod=30 Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.106008 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-metadata" containerID="cri-o://40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" gracePeriod=30 Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.109196 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.389870226 podStartE2EDuration="6.109174774s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="2025-10-07 11:39:47.394682613 +0000 UTC m=+1154.191081612" lastFinishedPulling="2025-10-07 11:39:51.113987171 +0000 UTC m=+1157.910386160" observedRunningTime="2025-10-07 11:39:52.098199028 +0000 UTC m=+1158.894598027" watchObservedRunningTime="2025-10-07 11:39:52.109174774 +0000 UTC m=+1158.905573783" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.117924 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05ed3d4c-807a-4022-a1e0-6e2fc312c99b","Type":"ContainerStarted","Data":"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373"} Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.131913 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" podStartSLOduration=6.131890115 podStartE2EDuration="6.131890115s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:39:52.122805109 +0000 UTC m=+1158.919204098" watchObservedRunningTime="2025-10-07 11:39:52.131890115 +0000 UTC m=+1158.928289104" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.144547 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7220812690000002 podStartE2EDuration="6.144531985s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="2025-10-07 11:39:47.691412102 +0000 UTC m=+1154.487811091" lastFinishedPulling="2025-10-07 11:39:51.113862828 +0000 UTC m=+1157.910261807" observedRunningTime="2025-10-07 11:39:52.14011402 +0000 UTC m=+1158.936513009" watchObservedRunningTime="2025-10-07 11:39:52.144531985 +0000 UTC m=+1158.940930974" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.169682 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.85418656 podStartE2EDuration="6.169652249s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="2025-10-07 11:39:47.798464621 +0000 UTC m=+1154.594863610" lastFinishedPulling="2025-10-07 11:39:51.11393032 +0000 UTC m=+1157.910329299" observedRunningTime="2025-10-07 11:39:52.160764928 +0000 UTC m=+1158.957163917" watchObservedRunningTime="2025-10-07 
11:39:52.169652249 +0000 UTC m=+1158.966051268" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.188296 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.902477238 podStartE2EDuration="6.188281314s" podCreationTimestamp="2025-10-07 11:39:46 +0000 UTC" firstStartedPulling="2025-10-07 11:39:47.834328985 +0000 UTC m=+1154.630727974" lastFinishedPulling="2025-10-07 11:39:51.120133051 +0000 UTC m=+1157.916532050" observedRunningTime="2025-10-07 11:39:52.187105484 +0000 UTC m=+1158.983504513" watchObservedRunningTime="2025-10-07 11:39:52.188281314 +0000 UTC m=+1158.984680303" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.735476 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.847918 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data\") pod \"4a351e5c-84ad-4252-96c0-87b770185cca\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.847964 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxwk\" (UniqueName: \"kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk\") pod \"4a351e5c-84ad-4252-96c0-87b770185cca\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.847995 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle\") pod \"4a351e5c-84ad-4252-96c0-87b770185cca\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.848053 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs\") pod \"4a351e5c-84ad-4252-96c0-87b770185cca\" (UID: \"4a351e5c-84ad-4252-96c0-87b770185cca\") " Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.848860 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs" (OuterVolumeSpecName: "logs") pod "4a351e5c-84ad-4252-96c0-87b770185cca" (UID: "4a351e5c-84ad-4252-96c0-87b770185cca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.854821 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk" (OuterVolumeSpecName: "kube-api-access-9cxwk") pod "4a351e5c-84ad-4252-96c0-87b770185cca" (UID: "4a351e5c-84ad-4252-96c0-87b770185cca"). InnerVolumeSpecName "kube-api-access-9cxwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.879866 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a351e5c-84ad-4252-96c0-87b770185cca" (UID: "4a351e5c-84ad-4252-96c0-87b770185cca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.885732 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data" (OuterVolumeSpecName: "config-data") pod "4a351e5c-84ad-4252-96c0-87b770185cca" (UID: "4a351e5c-84ad-4252-96c0-87b770185cca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.950641 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a351e5c-84ad-4252-96c0-87b770185cca-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.950676 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.950689 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cxwk\" (UniqueName: \"kubernetes.io/projected/4a351e5c-84ad-4252-96c0-87b770185cca-kube-api-access-9cxwk\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:52 crc kubenswrapper[4700]: I1007 11:39:52.950702 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a351e5c-84ad-4252-96c0-87b770185cca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.151908 4700 generic.go:334] "Generic (PLEG): container finished" podID="4a351e5c-84ad-4252-96c0-87b770185cca" containerID="40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" exitCode=0 Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.151944 4700 generic.go:334] "Generic (PLEG): container finished" podID="4a351e5c-84ad-4252-96c0-87b770185cca" containerID="12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" exitCode=143 Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.151989 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerDied","Data":"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540"} Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.152035 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerDied","Data":"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3"} Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.152049 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a351e5c-84ad-4252-96c0-87b770185cca","Type":"ContainerDied","Data":"330c5300ce113eefbae56aa835c6a1076b4b037b2b712a347520657feed2bf85"} Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.152068 4700 scope.go:117] "RemoveContainer" containerID="40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.152147 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.211579 4700 scope.go:117] "RemoveContainer" containerID="12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.218423 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.228017 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.237926 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:53 crc kubenswrapper[4700]: E1007 11:39:53.238571 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-metadata" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.238599 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-metadata" Oct 07 11:39:53 crc kubenswrapper[4700]: E1007 11:39:53.238614 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-log" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.238623 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-log" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.238863 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-metadata" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.238887 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" containerName="nova-metadata-log" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.245449 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.249926 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.250364 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.267087 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.267174 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84vz\" (UniqueName: \"kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz\") pod \"nova-metadata-0\" (UID: 
\"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.267377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.267571 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.267762 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.272715 4700 scope.go:117] "RemoveContainer" containerID="40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" Oct 07 11:39:53 crc kubenswrapper[4700]: E1007 11:39:53.276501 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540\": container with ID starting with 40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540 not found: ID does not exist" containerID="40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.276735 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540"} err="failed to get container status \"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540\": rpc error: code = NotFound desc = could not find container \"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540\": container with ID starting with 40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540 not found: ID does not exist" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.276763 4700 scope.go:117] "RemoveContainer" containerID="12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" Oct 07 11:39:53 crc kubenswrapper[4700]: E1007 11:39:53.277335 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3\": container with ID starting with 12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3 not found: ID does not exist" containerID="12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.277357 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3"} err="failed to get container status \"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3\": rpc error: code = NotFound desc = could not find container \"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3\": container with ID starting with 12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3 not found: ID does not exist" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.277371 4700 scope.go:117] "RemoveContainer" containerID="40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.277670 4700 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540"} err="failed to get container status \"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540\": rpc error: code = NotFound desc = could not find container \"40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540\": container with ID starting with 40019738950f968d899d5f8660ccdcd97f66bf35ade9a2b27b4dbc45030d0540 not found: ID does not exist" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.277690 4700 scope.go:117] "RemoveContainer" containerID="12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.277907 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3"} err="failed to get container status \"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3\": rpc error: code = NotFound desc = could not find container \"12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3\": container with ID starting with 12b39120483acf5e27c7c0e9ff92dea6b732cc22d13a6fdd77c9f524b0ef98d3 not found: ID does not exist" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.286581 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.377332 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.377562 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.377593 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j84vz\" (UniqueName: \"kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.377660 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.377735 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.378238 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.383094 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 
11:39:53.390725 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.392326 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.394111 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84vz\" (UniqueName: \"kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz\") pod \"nova-metadata-0\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.577669 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:53 crc kubenswrapper[4700]: I1007 11:39:53.969896 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a351e5c-84ad-4252-96c0-87b770185cca" path="/var/lib/kubelet/pods/4a351e5c-84ad-4252-96c0-87b770185cca/volumes" Oct 07 11:39:54 crc kubenswrapper[4700]: I1007 11:39:54.136949 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:54 crc kubenswrapper[4700]: I1007 11:39:54.142045 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 11:39:54 crc kubenswrapper[4700]: I1007 11:39:54.187949 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerStarted","Data":"bc4bc7f7f32ab414fca0016bee8ae030928d5ee44a2af24832e9cb4f8a1b6282"} Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.233034 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerStarted","Data":"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5"} Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.233528 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerStarted","Data":"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611"} Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.237867 4700 generic.go:334] "Generic (PLEG): container finished" podID="8f14d077-e353-4931-a13e-c2184c602276" containerID="c1af685aad61d376c1cfc82a063190289f77886b4ad1e6339fbed2b04f28c047" exitCode=0 Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.237919 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7vgrf" 
event={"ID":"8f14d077-e353-4931-a13e-c2184c602276","Type":"ContainerDied","Data":"c1af685aad61d376c1cfc82a063190289f77886b4ad1e6339fbed2b04f28c047"} Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.240944 4700 generic.go:334] "Generic (PLEG): container finished" podID="3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" containerID="deadea4bbdd14a2a9e9ae2298fb29d71b595d3d35bbee933462eb6a99a59727a" exitCode=0 Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.240970 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk898" event={"ID":"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d","Type":"ContainerDied","Data":"deadea4bbdd14a2a9e9ae2298fb29d71b595d3d35bbee933462eb6a99a59727a"} Oct 07 11:39:55 crc kubenswrapper[4700]: I1007 11:39:55.263574 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.263556077 podStartE2EDuration="2.263556077s" podCreationTimestamp="2025-10-07 11:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:39:55.253018703 +0000 UTC m=+1162.049417702" watchObservedRunningTime="2025-10-07 11:39:55.263556077 +0000 UTC m=+1162.059955066" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.692300 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.716583 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.717423 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.755880 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:39:56 crc 
kubenswrapper[4700]: I1007 11:39:56.773026 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.782111 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.821497 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.823417 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.867689 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.869866 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="dnsmasq-dns" containerID="cri-o://d353466d27b429c9a343e53308a8c7a0c67a9f1555fa1792fcbb02b44add53bb" gracePeriod=10 Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.871609 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958060 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data\") pod \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958171 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9669\" (UniqueName: 
\"kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669\") pod \"8f14d077-e353-4931-a13e-c2184c602276\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958211 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle\") pod \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958256 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data\") pod \"8f14d077-e353-4931-a13e-c2184c602276\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958348 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lgrp\" (UniqueName: \"kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp\") pod \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958473 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle\") pod \"8f14d077-e353-4931-a13e-c2184c602276\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.958494 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts\") pod \"8f14d077-e353-4931-a13e-c2184c602276\" (UID: \"8f14d077-e353-4931-a13e-c2184c602276\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 
11:39:56.958907 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts\") pod \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\" (UID: \"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d\") " Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.965860 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp" (OuterVolumeSpecName: "kube-api-access-5lgrp") pod "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" (UID: "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d"). InnerVolumeSpecName "kube-api-access-5lgrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.966769 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts" (OuterVolumeSpecName: "scripts") pod "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" (UID: "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.969492 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669" (OuterVolumeSpecName: "kube-api-access-j9669") pod "8f14d077-e353-4931-a13e-c2184c602276" (UID: "8f14d077-e353-4931-a13e-c2184c602276"). InnerVolumeSpecName "kube-api-access-j9669". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:56 crc kubenswrapper[4700]: I1007 11:39:56.989452 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts" (OuterVolumeSpecName: "scripts") pod "8f14d077-e353-4931-a13e-c2184c602276" (UID: "8f14d077-e353-4931-a13e-c2184c602276"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.012627 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data" (OuterVolumeSpecName: "config-data") pod "8f14d077-e353-4931-a13e-c2184c602276" (UID: "8f14d077-e353-4931-a13e-c2184c602276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.016139 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f14d077-e353-4931-a13e-c2184c602276" (UID: "8f14d077-e353-4931-a13e-c2184c602276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.037053 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data" (OuterVolumeSpecName: "config-data") pod "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" (UID: "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.037320 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" (UID: "3089c2b1-1959-42bb-87ff-1f9c4ab4d01d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061474 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lgrp\" (UniqueName: \"kubernetes.io/projected/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-kube-api-access-5lgrp\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061518 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061528 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061535 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061544 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061552 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9669\" (UniqueName: \"kubernetes.io/projected/8f14d077-e353-4931-a13e-c2184c602276-kube-api-access-j9669\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061561 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.061571 4700 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f14d077-e353-4931-a13e-c2184c602276-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.292168 4700 generic.go:334] "Generic (PLEG): container finished" podID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerID="d353466d27b429c9a343e53308a8c7a0c67a9f1555fa1792fcbb02b44add53bb" exitCode=0 Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.292239 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" event={"ID":"0366de18-d4be-4a04-8bc7-b6343f5fc3f8","Type":"ContainerDied","Data":"d353466d27b429c9a343e53308a8c7a0c67a9f1555fa1792fcbb02b44add53bb"} Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.294296 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7vgrf" event={"ID":"8f14d077-e353-4931-a13e-c2184c602276","Type":"ContainerDied","Data":"591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d"} Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.294339 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591d3bae7ddfb73b4a2d144afb0cb59efa653958863bd6a2968b217b8d040f5d" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.294396 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7vgrf" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.319490 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk898" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.319729 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk898" event={"ID":"3089c2b1-1959-42bb-87ff-1f9c4ab4d01d","Type":"ContainerDied","Data":"6a9bd262c75e83ed3650b86273d2b84a7d0fac6bbfedf336a980b52c816764d1"} Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.319767 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9bd262c75e83ed3650b86273d2b84a7d0fac6bbfedf336a980b52c816764d1" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.380596 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.471896 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 11:39:57 crc kubenswrapper[4700]: E1007 11:39:57.472280 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f14d077-e353-4931-a13e-c2184c602276" containerName="nova-manage" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.472296 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f14d077-e353-4931-a13e-c2184c602276" containerName="nova-manage" Oct 07 11:39:57 crc kubenswrapper[4700]: E1007 11:39:57.472320 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" containerName="nova-cell1-conductor-db-sync" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.472329 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" containerName="nova-cell1-conductor-db-sync" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.472512 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f14d077-e353-4931-a13e-c2184c602276" containerName="nova-manage" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 
11:39:57.472537 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" containerName="nova-cell1-conductor-db-sync" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.473103 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.475715 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.489798 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.498693 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.531110 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.531338 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-log" containerID="cri-o://8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" gracePeriod=30 Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.531408 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-metadata" containerID="cri-o://0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" gracePeriod=30 Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.553852 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.577807 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z2p\" (UniqueName: \"kubernetes.io/projected/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-kube-api-access-p9z2p\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.577892 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.577954 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.679321 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxlf7\" (UniqueName: \"kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.679392 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc 
kubenswrapper[4700]: I1007 11:39:57.679894 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.679941 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.679964 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.680115 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0\") pod \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\" (UID: \"0366de18-d4be-4a04-8bc7-b6343f5fc3f8\") " Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.680467 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z2p\" (UniqueName: \"kubernetes.io/projected/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-kube-api-access-p9z2p\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.680553 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.680620 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.683733 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7" (OuterVolumeSpecName: "kube-api-access-dxlf7") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "kube-api-access-dxlf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.685069 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.703450 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.704338 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z2p\" (UniqueName: 
\"kubernetes.io/projected/c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1-kube-api-access-p9z2p\") pod \"nova-cell1-conductor-0\" (UID: \"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.716495 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.758526 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.765587 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.766212 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config" (OuterVolumeSpecName: "config") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.782260 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxlf7\" (UniqueName: \"kubernetes.io/projected/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-kube-api-access-dxlf7\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.782290 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.782299 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.783663 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.804917 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.815774 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0366de18-d4be-4a04-8bc7-b6343f5fc3f8" (UID: "0366de18-d4be-4a04-8bc7-b6343f5fc3f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.877217 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.888484 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.888506 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:57 crc kubenswrapper[4700]: I1007 11:39:57.888517 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0366de18-d4be-4a04-8bc7-b6343f5fc3f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.018176 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.022730 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.196458 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle\") pod \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.196869 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs\") pod \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.196940 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84vz\" (UniqueName: \"kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz\") pod \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.197026 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs\") pod \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.197073 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data\") pod \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\" (UID: \"53cbf569-dc75-4043-bed9-8e82ee7b5cbb\") " Oct 07 11:39:58 
crc kubenswrapper[4700]: I1007 11:39:58.197205 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs" (OuterVolumeSpecName: "logs") pod "53cbf569-dc75-4043-bed9-8e82ee7b5cbb" (UID: "53cbf569-dc75-4043-bed9-8e82ee7b5cbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.202534 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.203626 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz" (OuterVolumeSpecName: "kube-api-access-j84vz") pod "53cbf569-dc75-4043-bed9-8e82ee7b5cbb" (UID: "53cbf569-dc75-4043-bed9-8e82ee7b5cbb"). InnerVolumeSpecName "kube-api-access-j84vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.237110 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53cbf569-dc75-4043-bed9-8e82ee7b5cbb" (UID: "53cbf569-dc75-4043-bed9-8e82ee7b5cbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.242498 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data" (OuterVolumeSpecName: "config-data") pod "53cbf569-dc75-4043-bed9-8e82ee7b5cbb" (UID: "53cbf569-dc75-4043-bed9-8e82ee7b5cbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.260463 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "53cbf569-dc75-4043-bed9-8e82ee7b5cbb" (UID: "53cbf569-dc75-4043-bed9-8e82ee7b5cbb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.304738 4700 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.304785 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.304795 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.304804 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j84vz\" (UniqueName: \"kubernetes.io/projected/53cbf569-dc75-4043-bed9-8e82ee7b5cbb-kube-api-access-j84vz\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332421 4700 generic.go:334] "Generic (PLEG): container finished" podID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerID="0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" exitCode=0 Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332462 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerID="8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" exitCode=143 Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332513 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerDied","Data":"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5"} Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332548 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerDied","Data":"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611"} Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332566 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cbf569-dc75-4043-bed9-8e82ee7b5cbb","Type":"ContainerDied","Data":"bc4bc7f7f32ab414fca0016bee8ae030928d5ee44a2af24832e9cb4f8a1b6282"} Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332584 4700 scope.go:117] "RemoveContainer" containerID="0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.332745 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.344814 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.344996 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-4mfq6" event={"ID":"0366de18-d4be-4a04-8bc7-b6343f5fc3f8","Type":"ContainerDied","Data":"5445846a9d2be1fa12fe5f1e511331b5de4a91e53ed584762fcef7fe34c15919"} Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.345337 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-log" containerID="cri-o://4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb" gracePeriod=30 Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.345411 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-api" containerID="cri-o://ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2" gracePeriod=30 Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.374518 4700 scope.go:117] "RemoveContainer" containerID="8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.391058 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.401138 4700 scope.go:117] "RemoveContainer" containerID="0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.402652 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-4mfq6"] Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.410464 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5\": container with ID starting with 0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5 not found: ID does not exist" containerID="0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.410515 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5"} err="failed to get container status \"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5\": rpc error: code = NotFound desc = could not find container \"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5\": container with ID starting with 0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5 not found: ID does not exist" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.410541 4700 scope.go:117] "RemoveContainer" containerID="8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.415961 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611\": container with ID starting with 8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611 not found: ID does not exist" containerID="8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.415995 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611"} err="failed to get container status \"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611\": rpc error: code = NotFound desc = could not find container \"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611\": container with ID 
starting with 8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611 not found: ID does not exist" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.416015 4700 scope.go:117] "RemoveContainer" containerID="0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.416091 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.418633 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5"} err="failed to get container status \"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5\": rpc error: code = NotFound desc = could not find container \"0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5\": container with ID starting with 0fe18548ad5a14a88c03c1ef741326e27ed95ee7960bc6d5fb42e24c73ef5bd5 not found: ID does not exist" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.418656 4700 scope.go:117] "RemoveContainer" containerID="8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.425709 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611"} err="failed to get container status \"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611\": rpc error: code = NotFound desc = could not find container \"8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611\": container with ID starting with 8d9e44d0ed9ac81d2b7b79114ced6e87ae012073f84b9fd2224831ee95ab4611 not found: ID does not exist" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.425760 4700 scope.go:117] "RemoveContainer" containerID="d353466d27b429c9a343e53308a8c7a0c67a9f1555fa1792fcbb02b44add53bb" Oct 07 11:39:58 
crc kubenswrapper[4700]: I1007 11:39:58.428461 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.448554 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.449090 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-log" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449108 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-log" Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.449132 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="dnsmasq-dns" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449140 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="dnsmasq-dns" Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.449169 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="init" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449177 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="init" Oct 07 11:39:58 crc kubenswrapper[4700]: E1007 11:39:58.449201 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-metadata" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449209 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-metadata" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449456 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-log" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449474 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" containerName="dnsmasq-dns" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.449488 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" containerName="nova-metadata-metadata" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.450984 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.453818 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.454150 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.455646 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.458407 4700 scope.go:117] "RemoveContainer" containerID="aab20955711c5d551669a5cabacec100b65721767ecfd298073c7c73c3d02a6d" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.482346 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.513253 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.513324 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.513360 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.513381 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.513407 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5t5\" (UniqueName: \"kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.615577 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.615989 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.616038 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.616102 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.616748 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5t5\" (UniqueName: \"kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.616975 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.619049 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " 
pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.620151 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.622122 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.636797 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5t5\" (UniqueName: \"kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5\") pod \"nova-metadata-0\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.782547 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.900822 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:39:58 crc kubenswrapper[4700]: I1007 11:39:58.901041 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="67b87725-3618-4837-b1b5-c98afe5de4a4" containerName="kube-state-metrics" containerID="cri-o://3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3" gracePeriod=30 Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.051343 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:39:59 crc kubenswrapper[4700]: W1007 11:39:59.069718 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13922810_c6e2_4774_b143_a1aadd32210b.slice/crio-2a52d001e1fa210b393fa3e7b5e1b668d8bbf8da3c2df86dbb1c4ab45a67493d WatchSource:0}: Error finding container 2a52d001e1fa210b393fa3e7b5e1b668d8bbf8da3c2df86dbb1c4ab45a67493d: Status 404 returned error can't find the container with id 2a52d001e1fa210b393fa3e7b5e1b668d8bbf8da3c2df86dbb1c4ab45a67493d Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.316353 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.360349 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1","Type":"ContainerStarted","Data":"8de03f1ce8a7103069f22397f15fdb6654f8347ceab1c9101582c2de4f96a8f6"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.360387 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1","Type":"ContainerStarted","Data":"ff738597f0168efa6a2c6792833729d7b8fea17747d0f4431d37bed47e8a4a96"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.361234 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.378395 4700 generic.go:334] "Generic (PLEG): container finished" podID="615aed2c-5f89-4595-b80a-d1241196b64a" containerID="4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb" exitCode=143 Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.378478 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerDied","Data":"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.383332 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerStarted","Data":"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.383376 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerStarted","Data":"2a52d001e1fa210b393fa3e7b5e1b668d8bbf8da3c2df86dbb1c4ab45a67493d"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.388655 4700 generic.go:334] "Generic (PLEG): container finished" podID="67b87725-3618-4837-b1b5-c98afe5de4a4" containerID="3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3" exitCode=2 Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.388728 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67b87725-3618-4837-b1b5-c98afe5de4a4","Type":"ContainerDied","Data":"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.388759 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67b87725-3618-4837-b1b5-c98afe5de4a4","Type":"ContainerDied","Data":"ba7afb090d031851d61e5e006f06b697007f89d947e9084f3453d1b4100f8bf3"} Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.388779 4700 scope.go:117] "RemoveContainer" containerID="3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.388895 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.391850 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.391827301 podStartE2EDuration="2.391827301s" podCreationTimestamp="2025-10-07 11:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:39:59.378807451 +0000 UTC m=+1166.175206450" watchObservedRunningTime="2025-10-07 11:39:59.391827301 +0000 UTC m=+1166.188226290" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.399256 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerName="nova-scheduler-scheduler" containerID="cri-o://edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" gracePeriod=30 Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.427460 4700 scope.go:117] "RemoveContainer" containerID="3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3" Oct 07 11:39:59 crc kubenswrapper[4700]: E1007 11:39:59.431576 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3\": container with ID starting with 3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3 not found: ID does not exist" containerID="3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.431624 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3"} err="failed to get container status \"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3\": rpc error: code = NotFound desc 
= could not find container \"3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3\": container with ID starting with 3b9924ecb48939a9f995e1ad46667c3e95d3f828df19c72d5316ddcb7558a8c3 not found: ID does not exist" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.434646 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbs5p\" (UniqueName: \"kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p\") pod \"67b87725-3618-4837-b1b5-c98afe5de4a4\" (UID: \"67b87725-3618-4837-b1b5-c98afe5de4a4\") " Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.439685 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p" (OuterVolumeSpecName: "kube-api-access-rbs5p") pod "67b87725-3618-4837-b1b5-c98afe5de4a4" (UID: "67b87725-3618-4837-b1b5-c98afe5de4a4"). InnerVolumeSpecName "kube-api-access-rbs5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.537758 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbs5p\" (UniqueName: \"kubernetes.io/projected/67b87725-3618-4837-b1b5-c98afe5de4a4-kube-api-access-rbs5p\") on node \"crc\" DevicePath \"\"" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.732696 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.739342 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.751731 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:39:59 crc kubenswrapper[4700]: E1007 11:39:59.752220 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b87725-3618-4837-b1b5-c98afe5de4a4" 
containerName="kube-state-metrics" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.752243 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b87725-3618-4837-b1b5-c98afe5de4a4" containerName="kube-state-metrics" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.752509 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b87725-3618-4837-b1b5-c98afe5de4a4" containerName="kube-state-metrics" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.753534 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.755742 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.756319 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.763446 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.843919 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4h7\" (UniqueName: \"kubernetes.io/projected/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-api-access-cd4h7\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.843988 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.844150 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.844258 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.945483 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.945564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.945630 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 
11:39:59.945676 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4h7\" (UniqueName: \"kubernetes.io/projected/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-api-access-cd4h7\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.950952 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.952829 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.964936 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4h7\" (UniqueName: \"kubernetes.io/projected/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-api-access-cd4h7\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.967815 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0366de18-d4be-4a04-8bc7-b6343f5fc3f8" path="/var/lib/kubelet/pods/0366de18-d4be-4a04-8bc7-b6343f5fc3f8/volumes" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.968420 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cbf569-dc75-4043-bed9-8e82ee7b5cbb" path="/var/lib/kubelet/pods/53cbf569-dc75-4043-bed9-8e82ee7b5cbb/volumes" Oct 07 11:39:59 
crc kubenswrapper[4700]: I1007 11:39:59.968896 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b87725-3618-4837-b1b5-c98afe5de4a4" path="/var/lib/kubelet/pods/67b87725-3618-4837-b1b5-c98afe5de4a4/volumes" Oct 07 11:39:59 crc kubenswrapper[4700]: I1007 11:39:59.971224 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed269e79-4083-4c3b-b44e-4986f2d82921-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed269e79-4083-4c3b-b44e-4986f2d82921\") " pod="openstack/kube-state-metrics-0" Oct 07 11:40:00 crc kubenswrapper[4700]: I1007 11:40:00.073877 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 11:40:00 crc kubenswrapper[4700]: I1007 11:40:00.410471 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerStarted","Data":"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3"} Oct 07 11:40:00 crc kubenswrapper[4700]: I1007 11:40:00.444399 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.44438223 podStartE2EDuration="2.44438223s" podCreationTimestamp="2025-10-07 11:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:00.430247801 +0000 UTC m=+1167.226646790" watchObservedRunningTime="2025-10-07 11:40:00.44438223 +0000 UTC m=+1167.240781219" Oct 07 11:40:00 crc kubenswrapper[4700]: I1007 11:40:00.576773 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.244234 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:01 crc 
kubenswrapper[4700]: I1007 11:40:01.244930 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-central-agent" containerID="cri-o://6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" gracePeriod=30 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.245020 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="proxy-httpd" containerID="cri-o://f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" gracePeriod=30 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.245005 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="sg-core" containerID="cri-o://afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" gracePeriod=30 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.245005 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-notification-agent" containerID="cri-o://5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" gracePeriod=30 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.424531 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed269e79-4083-4c3b-b44e-4986f2d82921","Type":"ContainerStarted","Data":"cefd39b4f99c7cf823170d61bccf2bdb977542e99340b45cb8521970a7a03bce"} Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.424574 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed269e79-4083-4c3b-b44e-4986f2d82921","Type":"ContainerStarted","Data":"816f0d102a9d0c538c364311bdd557c423c65cd15c783b26de652ad00da3db3b"} Oct 07 
11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.425590 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.428839 4700 generic.go:334] "Generic (PLEG): container finished" podID="b765c26c-24c0-41b8-a126-0524806c134d" containerID="f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" exitCode=0 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.428866 4700 generic.go:334] "Generic (PLEG): container finished" podID="b765c26c-24c0-41b8-a126-0524806c134d" containerID="afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" exitCode=2 Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.429545 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerDied","Data":"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a"} Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.429566 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerDied","Data":"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc"} Oct 07 11:40:01 crc kubenswrapper[4700]: I1007 11:40:01.449351 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.100694483 podStartE2EDuration="2.449330478s" podCreationTimestamp="2025-10-07 11:39:59 +0000 UTC" firstStartedPulling="2025-10-07 11:40:00.585926443 +0000 UTC m=+1167.382325452" lastFinishedPulling="2025-10-07 11:40:00.934562448 +0000 UTC m=+1167.730961447" observedRunningTime="2025-10-07 11:40:01.448143567 +0000 UTC m=+1168.244542556" watchObservedRunningTime="2025-10-07 11:40:01.449330478 +0000 UTC m=+1168.245729477" Oct 07 11:40:01 crc kubenswrapper[4700]: E1007 11:40:01.824104 4700 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:01 crc kubenswrapper[4700]: E1007 11:40:01.825545 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:01 crc kubenswrapper[4700]: E1007 11:40:01.826936 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:01 crc kubenswrapper[4700]: E1007 11:40:01.826971 4700 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerName="nova-scheduler-scheduler" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.038867 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183446 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183802 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183843 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183863 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183884 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchbt\" (UniqueName: \"kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183900 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.183949 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts\") pod \"b765c26c-24c0-41b8-a126-0524806c134d\" (UID: \"b765c26c-24c0-41b8-a126-0524806c134d\") " Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.184252 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.184472 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.184606 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.189383 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt" (OuterVolumeSpecName: "kube-api-access-hchbt") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). 
InnerVolumeSpecName "kube-api-access-hchbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.189433 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts" (OuterVolumeSpecName: "scripts") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.209497 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.259087 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.286109 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b765c26c-24c0-41b8-a126-0524806c134d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.286192 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchbt\" (UniqueName: \"kubernetes.io/projected/b765c26c-24c0-41b8-a126-0524806c134d-kube-api-access-hchbt\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.286208 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.286219 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.286287 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.301784 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data" (OuterVolumeSpecName: "config-data") pod "b765c26c-24c0-41b8-a126-0524806c134d" (UID: "b765c26c-24c0-41b8-a126-0524806c134d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.388331 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b765c26c-24c0-41b8-a126-0524806c134d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.442930 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.442891 4700 generic.go:334] "Generic (PLEG): container finished" podID="b765c26c-24c0-41b8-a126-0524806c134d" containerID="5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" exitCode=0 Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.443299 4700 generic.go:334] "Generic (PLEG): container finished" podID="b765c26c-24c0-41b8-a126-0524806c134d" containerID="6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" exitCode=0 Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.442952 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerDied","Data":"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb"} Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.443379 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerDied","Data":"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97"} Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.443420 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b765c26c-24c0-41b8-a126-0524806c134d","Type":"ContainerDied","Data":"6902e3bc236407e12ed29d73d61d13cdb85e8be92dba60de94684c9779c4aa06"} Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.443465 4700 scope.go:117] "RemoveContainer" 
containerID="f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.484178 4700 scope.go:117] "RemoveContainer" containerID="afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.487579 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.501121 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.520647 4700 scope.go:117] "RemoveContainer" containerID="5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.525428 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.525809 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="proxy-httpd" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.525825 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="proxy-httpd" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.525842 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-central-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.525850 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-central-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.525878 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="sg-core" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.525886 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="sg-core" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.525895 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-notification-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.525901 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-notification-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.526065 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="proxy-httpd" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.526082 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="sg-core" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.526092 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-central-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.526102 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b765c26c-24c0-41b8-a126-0524806c134d" containerName="ceilometer-notification-agent" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.539501 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.547016 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.547180 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.547297 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.574569 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.576478 4700 scope.go:117] "RemoveContainer" containerID="6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652063 4700 scope.go:117] "RemoveContainer" containerID="f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.652441 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a\": container with ID starting with f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a not found: ID does not exist" containerID="f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652468 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a"} err="failed to get container status \"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a\": rpc error: code = NotFound desc = could not find container \"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a\": 
container with ID starting with f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652489 4700 scope.go:117] "RemoveContainer" containerID="afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.652644 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc\": container with ID starting with afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc not found: ID does not exist" containerID="afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652659 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc"} err="failed to get container status \"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc\": rpc error: code = NotFound desc = could not find container \"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc\": container with ID starting with afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652671 4700 scope.go:117] "RemoveContainer" containerID="5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.652824 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb\": container with ID starting with 5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb not found: ID does not exist" 
containerID="5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652846 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb"} err="failed to get container status \"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb\": rpc error: code = NotFound desc = could not find container \"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb\": container with ID starting with 5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.652859 4700 scope.go:117] "RemoveContainer" containerID="6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" Oct 07 11:40:02 crc kubenswrapper[4700]: E1007 11:40:02.653005 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97\": container with ID starting with 6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97 not found: ID does not exist" containerID="6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653024 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97"} err="failed to get container status \"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97\": rpc error: code = NotFound desc = could not find container \"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97\": container with ID starting with 6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97 not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653037 4700 scope.go:117] 
"RemoveContainer" containerID="f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653193 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a"} err="failed to get container status \"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a\": rpc error: code = NotFound desc = could not find container \"f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a\": container with ID starting with f83d0e3646111a30ff529e17ebeb46e1ef5491f801750f3a1defb5aff45ed57a not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653211 4700 scope.go:117] "RemoveContainer" containerID="afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653369 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc"} err="failed to get container status \"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc\": rpc error: code = NotFound desc = could not find container \"afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc\": container with ID starting with afc0c4721959cf69ae63566f0342a4922a525abca700f59eb5d713f0cbad60fc not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653408 4700 scope.go:117] "RemoveContainer" containerID="5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653556 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb"} err="failed to get container status \"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb\": rpc error: code = 
NotFound desc = could not find container \"5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb\": container with ID starting with 5e5801de6a400d3d3dcffcd9343f50edcd34145d0cee52bf49ee0f30d7eb17cb not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653577 4700 scope.go:117] "RemoveContainer" containerID="6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.653742 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97"} err="failed to get container status \"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97\": rpc error: code = NotFound desc = could not find container \"6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97\": container with ID starting with 6e927eac471a5fd273fd455375bf8554a20a98e87b7c46a54563db72f4d86b97 not found: ID does not exist" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697707 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697767 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697798 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697825 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6t8\" (UniqueName: \"kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697842 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.697982 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.698075 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.698114 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd\") pod \"ceilometer-0\" (UID: 
\"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.799849 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.799908 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.799940 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6t8\" (UniqueName: \"kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.799963 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800002 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800033 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800055 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800120 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800641 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.800820 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.804489 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc 
kubenswrapper[4700]: I1007 11:40:02.807884 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.808292 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.808991 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.813655 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.815817 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6t8\" (UniqueName: \"kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8\") pod \"ceilometer-0\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " pod="openstack/ceilometer-0" Oct 07 11:40:02 crc kubenswrapper[4700]: I1007 11:40:02.862687 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.359909 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.393689 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:03 crc kubenswrapper[4700]: W1007 11:40:03.402137 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod693c5adf_f425_44be_8e3c_29c0f61b0e92.slice/crio-80dea088c185aa94b55f05de476530cc72c7c618bddaae3e5905ac6f7ff63866 WatchSource:0}: Error finding container 80dea088c185aa94b55f05de476530cc72c7c618bddaae3e5905ac6f7ff63866: Status 404 returned error can't find the container with id 80dea088c185aa94b55f05de476530cc72c7c618bddaae3e5905ac6f7ff63866 Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.455474 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerStarted","Data":"80dea088c185aa94b55f05de476530cc72c7c618bddaae3e5905ac6f7ff63866"} Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.457100 4700 generic.go:334] "Generic (PLEG): container finished" podID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" exitCode=0 Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.457174 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.457214 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05ed3d4c-807a-4022-a1e0-6e2fc312c99b","Type":"ContainerDied","Data":"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373"} Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.457254 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05ed3d4c-807a-4022-a1e0-6e2fc312c99b","Type":"ContainerDied","Data":"c8fbb1930cca26990610056fe7841bf7ad8e7f0d2ee52c1090da96e2a1b7f7c4"} Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.457275 4700 scope.go:117] "RemoveContainer" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.480113 4700 scope.go:117] "RemoveContainer" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" Oct 07 11:40:03 crc kubenswrapper[4700]: E1007 11:40:03.480606 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373\": container with ID starting with edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373 not found: ID does not exist" containerID="edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.480659 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373"} err="failed to get container status \"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373\": rpc error: code = NotFound desc = could not find container \"edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373\": container with ID starting with 
edb0bfe4d57fed9e2aa29f479479c8c6d9e0446175d093acf8fa0526a9b10373 not found: ID does not exist" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.526348 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle\") pod \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.526482 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data\") pod \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.526682 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hs9l\" (UniqueName: \"kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l\") pod \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\" (UID: \"05ed3d4c-807a-4022-a1e0-6e2fc312c99b\") " Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.531214 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l" (OuterVolumeSpecName: "kube-api-access-7hs9l") pod "05ed3d4c-807a-4022-a1e0-6e2fc312c99b" (UID: "05ed3d4c-807a-4022-a1e0-6e2fc312c99b"). InnerVolumeSpecName "kube-api-access-7hs9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.553386 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05ed3d4c-807a-4022-a1e0-6e2fc312c99b" (UID: "05ed3d4c-807a-4022-a1e0-6e2fc312c99b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.566835 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data" (OuterVolumeSpecName: "config-data") pod "05ed3d4c-807a-4022-a1e0-6e2fc312c99b" (UID: "05ed3d4c-807a-4022-a1e0-6e2fc312c99b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.629896 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hs9l\" (UniqueName: \"kubernetes.io/projected/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-kube-api-access-7hs9l\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.629930 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.629942 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ed3d4c-807a-4022-a1e0-6e2fc312c99b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.782662 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.782856 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.798942 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.834366 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:03 crc 
kubenswrapper[4700]: I1007 11:40:03.843618 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:03 crc kubenswrapper[4700]: E1007 11:40:03.844206 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerName="nova-scheduler-scheduler" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.844232 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerName="nova-scheduler-scheduler" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.844534 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" containerName="nova-scheduler-scheduler" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.845426 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.851711 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.852503 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.935822 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7hm\" (UniqueName: \"kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.936221 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " 
pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.936442 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.977253 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ed3d4c-807a-4022-a1e0-6e2fc312c99b" path="/var/lib/kubelet/pods/05ed3d4c-807a-4022-a1e0-6e2fc312c99b/volumes" Oct 07 11:40:03 crc kubenswrapper[4700]: I1007 11:40:03.977963 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b765c26c-24c0-41b8-a126-0524806c134d" path="/var/lib/kubelet/pods/b765c26c-24c0-41b8-a126-0524806c134d/volumes" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.038462 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf7hm\" (UniqueName: \"kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.038775 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.038862 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " 
pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.046331 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.047911 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.069792 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7hm\" (UniqueName: \"kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm\") pod \"nova-scheduler-0\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.171183 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.469973 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerStarted","Data":"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5"} Oct 07 11:40:04 crc kubenswrapper[4700]: I1007 11:40:04.620120 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.254883 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361179 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs\") pod \"615aed2c-5f89-4595-b80a-d1241196b64a\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361291 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle\") pod \"615aed2c-5f89-4595-b80a-d1241196b64a\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361506 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data\") pod \"615aed2c-5f89-4595-b80a-d1241196b64a\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361544 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdc9g\" (UniqueName: \"kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g\") pod \"615aed2c-5f89-4595-b80a-d1241196b64a\" (UID: \"615aed2c-5f89-4595-b80a-d1241196b64a\") " Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361620 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs" (OuterVolumeSpecName: "logs") pod "615aed2c-5f89-4595-b80a-d1241196b64a" (UID: "615aed2c-5f89-4595-b80a-d1241196b64a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.361933 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615aed2c-5f89-4595-b80a-d1241196b64a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.366718 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g" (OuterVolumeSpecName: "kube-api-access-wdc9g") pod "615aed2c-5f89-4595-b80a-d1241196b64a" (UID: "615aed2c-5f89-4595-b80a-d1241196b64a"). InnerVolumeSpecName "kube-api-access-wdc9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.393476 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data" (OuterVolumeSpecName: "config-data") pod "615aed2c-5f89-4595-b80a-d1241196b64a" (UID: "615aed2c-5f89-4595-b80a-d1241196b64a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.397003 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615aed2c-5f89-4595-b80a-d1241196b64a" (UID: "615aed2c-5f89-4595-b80a-d1241196b64a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.465391 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.465439 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdc9g\" (UniqueName: \"kubernetes.io/projected/615aed2c-5f89-4595-b80a-d1241196b64a-kube-api-access-wdc9g\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.465457 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615aed2c-5f89-4595-b80a-d1241196b64a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.493856 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"81608928-68ca-42fc-9839-c19d6efdb0f1","Type":"ContainerStarted","Data":"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc"} Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.493916 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"81608928-68ca-42fc-9839-c19d6efdb0f1","Type":"ContainerStarted","Data":"45bf8fc13a710b2a3a9366073caaef4859071654cc7140ea8c09697825fe0096"} Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.498373 4700 generic.go:334] "Generic (PLEG): container finished" podID="615aed2c-5f89-4595-b80a-d1241196b64a" containerID="ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2" exitCode=0 Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.498440 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerDied","Data":"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2"} Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.498484 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.499354 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"615aed2c-5f89-4595-b80a-d1241196b64a","Type":"ContainerDied","Data":"8db1ae417e647abce81569a6e559bff8f4f9d9c18816f1e3a243c8526d512762"} Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.499379 4700 scope.go:117] "RemoveContainer" containerID="ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.512807 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerStarted","Data":"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c"} Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.517123 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.517109141 podStartE2EDuration="2.517109141s" podCreationTimestamp="2025-10-07 11:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:05.51208557 +0000 UTC m=+1172.308484559" watchObservedRunningTime="2025-10-07 11:40:05.517109141 +0000 UTC m=+1172.313508130" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.540057 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.541428 4700 scope.go:117] "RemoveContainer" containerID="4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb" Oct 07 11:40:05 crc 
kubenswrapper[4700]: I1007 11:40:05.550919 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.562570 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:05 crc kubenswrapper[4700]: E1007 11:40:05.562945 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-api" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.562961 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-api" Oct 07 11:40:05 crc kubenswrapper[4700]: E1007 11:40:05.562996 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-log" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.563004 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-log" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.563197 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-api" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.563210 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" containerName="nova-api-log" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.564176 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.570008 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.577705 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.606073 4700 scope.go:117] "RemoveContainer" containerID="ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2" Oct 07 11:40:05 crc kubenswrapper[4700]: E1007 11:40:05.606999 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2\": container with ID starting with ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2 not found: ID does not exist" containerID="ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.607026 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2"} err="failed to get container status \"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2\": rpc error: code = NotFound desc = could not find container \"ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2\": container with ID starting with ff202ffd72b27ded4bad082b24ccc925b216deac726d94b11b3514daa9c30de2 not found: ID does not exist" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.607048 4700 scope.go:117] "RemoveContainer" containerID="4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb" Oct 07 11:40:05 crc kubenswrapper[4700]: E1007 11:40:05.607500 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb\": container with ID starting with 4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb not found: ID does not exist" containerID="4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.607523 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb"} err="failed to get container status \"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb\": rpc error: code = NotFound desc = could not find container \"4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb\": container with ID starting with 4cf9fab36dfa862401b67a39ff4a5238ce0c9b6cb2521ac2b78c7ab0740ad2cb not found: ID does not exist" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.671640 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vncmq\" (UniqueName: \"kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.672076 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.672103 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc 
kubenswrapper[4700]: I1007 11:40:05.672120 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.773395 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.773438 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.773458 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.773565 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vncmq\" (UniqueName: \"kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.773878 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs\") pod 
\"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.786070 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.786663 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.794037 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vncmq\" (UniqueName: \"kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq\") pod \"nova-api-0\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.906845 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:05 crc kubenswrapper[4700]: I1007 11:40:05.974584 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615aed2c-5f89-4595-b80a-d1241196b64a" path="/var/lib/kubelet/pods/615aed2c-5f89-4595-b80a-d1241196b64a/volumes" Oct 07 11:40:06 crc kubenswrapper[4700]: I1007 11:40:06.356326 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:06 crc kubenswrapper[4700]: I1007 11:40:06.525155 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerStarted","Data":"f53e7b5f250caf360ef94c6bdd6af6c373ea58b32c2027a3b9029303d09c2a16"} Oct 07 11:40:06 crc kubenswrapper[4700]: I1007 11:40:06.530552 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerStarted","Data":"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17"} Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.542706 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerStarted","Data":"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c"} Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.544041 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerStarted","Data":"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355"} Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.544084 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerStarted","Data":"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617"} Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.589101 
4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.259310655 podStartE2EDuration="5.589057604s" podCreationTimestamp="2025-10-07 11:40:02 +0000 UTC" firstStartedPulling="2025-10-07 11:40:03.404324061 +0000 UTC m=+1170.200723050" lastFinishedPulling="2025-10-07 11:40:06.734071 +0000 UTC m=+1173.530469999" observedRunningTime="2025-10-07 11:40:07.568451237 +0000 UTC m=+1174.364850256" watchObservedRunningTime="2025-10-07 11:40:07.589057604 +0000 UTC m=+1174.385456593" Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.612179 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.612151877 podStartE2EDuration="2.612151877s" podCreationTimestamp="2025-10-07 11:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:07.604147638 +0000 UTC m=+1174.400546667" watchObservedRunningTime="2025-10-07 11:40:07.612151877 +0000 UTC m=+1174.408550886" Oct 07 11:40:07 crc kubenswrapper[4700]: I1007 11:40:07.932657 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 11:40:08 crc kubenswrapper[4700]: I1007 11:40:08.555037 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:40:08 crc kubenswrapper[4700]: I1007 11:40:08.783041 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 11:40:08 crc kubenswrapper[4700]: I1007 11:40:08.783101 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 11:40:09 crc kubenswrapper[4700]: I1007 11:40:09.172721 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 11:40:09 crc kubenswrapper[4700]: I1007 
11:40:09.798573 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:09 crc kubenswrapper[4700]: I1007 11:40:09.798592 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:10 crc kubenswrapper[4700]: I1007 11:40:10.088710 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 11:40:14 crc kubenswrapper[4700]: I1007 11:40:14.171988 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 11:40:14 crc kubenswrapper[4700]: I1007 11:40:14.212920 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 11:40:14 crc kubenswrapper[4700]: I1007 11:40:14.650171 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 11:40:15 crc kubenswrapper[4700]: I1007 11:40:15.907501 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:40:15 crc kubenswrapper[4700]: I1007 11:40:15.907880 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:40:16 crc kubenswrapper[4700]: I1007 11:40:16.990563 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:16 crc kubenswrapper[4700]: I1007 11:40:16.990555 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:18 crc kubenswrapper[4700]: I1007 11:40:18.796787 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 11:40:18 crc kubenswrapper[4700]: I1007 11:40:18.796891 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 11:40:18 crc kubenswrapper[4700]: I1007 11:40:18.804814 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 11:40:18 crc kubenswrapper[4700]: I1007 11:40:18.807379 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.643294 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.695559 4700 generic.go:334] "Generic (PLEG): container finished" podID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" containerID="560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c" exitCode=137 Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.695601 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc56a36f-8764-4037-8c5a-e14a0ee8e309","Type":"ContainerDied","Data":"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c"} Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.695630 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc56a36f-8764-4037-8c5a-e14a0ee8e309","Type":"ContainerDied","Data":"537285339f143d2e980e9a38bc3c93df2419237c52c31201159ef69a7242fbfe"} Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.695646 4700 scope.go:117] "RemoveContainer" containerID="560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.695748 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.716838 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data\") pod \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.716938 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle\") pod \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.717039 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnb7\" (UniqueName: \"kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7\") pod \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\" (UID: \"dc56a36f-8764-4037-8c5a-e14a0ee8e309\") " Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.722666 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7" (OuterVolumeSpecName: "kube-api-access-llnb7") pod "dc56a36f-8764-4037-8c5a-e14a0ee8e309" (UID: "dc56a36f-8764-4037-8c5a-e14a0ee8e309"). InnerVolumeSpecName "kube-api-access-llnb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.723572 4700 scope.go:117] "RemoveContainer" containerID="560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c" Oct 07 11:40:22 crc kubenswrapper[4700]: E1007 11:40:22.724047 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c\": container with ID starting with 560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c not found: ID does not exist" containerID="560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.724181 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c"} err="failed to get container status \"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c\": rpc error: code = NotFound desc = could not find container \"560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c\": container with ID starting with 560e2a7cce2411869b81e9c004c6f90ac805820feb943061cf762cd99ad9d03c not found: ID does not exist" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.753273 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc56a36f-8764-4037-8c5a-e14a0ee8e309" (UID: "dc56a36f-8764-4037-8c5a-e14a0ee8e309"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.755105 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data" (OuterVolumeSpecName: "config-data") pod "dc56a36f-8764-4037-8c5a-e14a0ee8e309" (UID: "dc56a36f-8764-4037-8c5a-e14a0ee8e309"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.818842 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llnb7\" (UniqueName: \"kubernetes.io/projected/dc56a36f-8764-4037-8c5a-e14a0ee8e309-kube-api-access-llnb7\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.818880 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:22 crc kubenswrapper[4700]: I1007 11:40:22.818890 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc56a36f-8764-4037-8c5a-e14a0ee8e309-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.047229 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.054762 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.081912 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:40:23 crc kubenswrapper[4700]: E1007 11:40:23.082501 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 
11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.082528 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.082808 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.083734 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.087481 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.087789 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.088336 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.096121 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.226349 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.226427 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blbk2\" (UniqueName: \"kubernetes.io/projected/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-kube-api-access-blbk2\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.226517 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.226604 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.226664 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.328050 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.328139 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.328208 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.328255 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.328287 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blbk2\" (UniqueName: \"kubernetes.io/projected/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-kube-api-access-blbk2\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.333772 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.333933 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.334424 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.343355 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.346643 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blbk2\" (UniqueName: \"kubernetes.io/projected/a4b40ae6-2f36-447e-bc97-7cbcfd970bce-kube-api-access-blbk2\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4b40ae6-2f36-447e-bc97-7cbcfd970bce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.414673 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:23 crc kubenswrapper[4700]: W1007 11:40:23.915519 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4b40ae6_2f36_447e_bc97_7cbcfd970bce.slice/crio-187ea6c4c5afa79c2e2a7d2641232fc1270d889bd67a4727a5c665d89fa3949c WatchSource:0}: Error finding container 187ea6c4c5afa79c2e2a7d2641232fc1270d889bd67a4727a5c665d89fa3949c: Status 404 returned error can't find the container with id 187ea6c4c5afa79c2e2a7d2641232fc1270d889bd67a4727a5c665d89fa3949c Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.947952 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 11:40:23 crc kubenswrapper[4700]: I1007 11:40:23.988225 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc56a36f-8764-4037-8c5a-e14a0ee8e309" path="/var/lib/kubelet/pods/dc56a36f-8764-4037-8c5a-e14a0ee8e309/volumes" Oct 07 11:40:24 crc kubenswrapper[4700]: I1007 11:40:24.725087 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4b40ae6-2f36-447e-bc97-7cbcfd970bce","Type":"ContainerStarted","Data":"26d3d8b6c63853bc517d6f7969d47f56e34818d904bf2b6b52e374021ca16328"} Oct 07 11:40:24 crc kubenswrapper[4700]: I1007 11:40:24.726966 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4b40ae6-2f36-447e-bc97-7cbcfd970bce","Type":"ContainerStarted","Data":"187ea6c4c5afa79c2e2a7d2641232fc1270d889bd67a4727a5c665d89fa3949c"} Oct 07 11:40:24 crc kubenswrapper[4700]: I1007 11:40:24.757178 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.7571438860000002 podStartE2EDuration="1.757143886s" podCreationTimestamp="2025-10-07 11:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:24.754013285 +0000 UTC m=+1191.550412304" watchObservedRunningTime="2025-10-07 11:40:24.757143886 +0000 UTC m=+1191.553542915" Oct 07 11:40:25 crc kubenswrapper[4700]: I1007 11:40:25.911831 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 11:40:25 crc kubenswrapper[4700]: I1007 11:40:25.912261 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 11:40:25 crc kubenswrapper[4700]: I1007 11:40:25.913614 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 11:40:25 crc kubenswrapper[4700]: I1007 11:40:25.923786 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 11:40:26 crc kubenswrapper[4700]: I1007 11:40:26.759299 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 11:40:26 crc kubenswrapper[4700]: I1007 11:40:26.763044 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 11:40:26 crc kubenswrapper[4700]: I1007 11:40:26.970451 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:40:26 crc kubenswrapper[4700]: I1007 11:40:26.975443 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:26 crc kubenswrapper[4700]: I1007 11:40:26.989694 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117568 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117622 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fp2j\" (UniqueName: \"kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117651 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117699 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117715 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.117748 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219196 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219249 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219323 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219526 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219566 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fp2j\" (UniqueName: \"kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.219597 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.220336 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.220418 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.220467 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.220566 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.221780 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.244333 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fp2j\" (UniqueName: \"kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j\") pod \"dnsmasq-dns-79b5d74c8c-c4m94\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.307344 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:27 crc kubenswrapper[4700]: W1007 11:40:27.838780 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod306818d0_d575_4dae_be97_b92df625ff44.slice/crio-fbad093306cafb9237c8d7de696fc8032502cc9601004d42a0a083ec1aec272c WatchSource:0}: Error finding container fbad093306cafb9237c8d7de696fc8032502cc9601004d42a0a083ec1aec272c: Status 404 returned error can't find the container with id fbad093306cafb9237c8d7de696fc8032502cc9601004d42a0a083ec1aec272c Oct 07 11:40:27 crc kubenswrapper[4700]: I1007 11:40:27.851586 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.415780 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.779362 4700 generic.go:334] "Generic (PLEG): container finished" podID="306818d0-d575-4dae-be97-b92df625ff44" containerID="c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10" exitCode=0 Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.779422 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" event={"ID":"306818d0-d575-4dae-be97-b92df625ff44","Type":"ContainerDied","Data":"c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10"} Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.779478 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" event={"ID":"306818d0-d575-4dae-be97-b92df625ff44","Type":"ContainerStarted","Data":"fbad093306cafb9237c8d7de696fc8032502cc9601004d42a0a083ec1aec272c"} Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.916188 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:28 crc 
kubenswrapper[4700]: I1007 11:40:28.921432 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-central-agent" containerID="cri-o://b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5" gracePeriod=30 Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.921634 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="proxy-httpd" containerID="cri-o://5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c" gracePeriod=30 Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.921686 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="sg-core" containerID="cri-o://27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17" gracePeriod=30 Oct 07 11:40:28 crc kubenswrapper[4700]: I1007 11:40:28.921729 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-notification-agent" containerID="cri-o://4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c" gracePeriod=30 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.026029 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.205:3000/\": read tcp 10.217.0.2:47328->10.217.0.205:3000: read: connection reset by peer" Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.432605 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.790818 4700 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" event={"ID":"306818d0-d575-4dae-be97-b92df625ff44","Type":"ContainerStarted","Data":"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227"} Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.792196 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.794916 4700 generic.go:334] "Generic (PLEG): container finished" podID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerID="5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c" exitCode=0 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795064 4700 generic.go:334] "Generic (PLEG): container finished" podID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerID="27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17" exitCode=2 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795134 4700 generic.go:334] "Generic (PLEG): container finished" podID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerID="b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5" exitCode=0 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.794987 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerDied","Data":"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c"} Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795288 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerDied","Data":"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17"} Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795340 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerDied","Data":"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5"} Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795550 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-log" containerID="cri-o://9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355" gracePeriod=30 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.795610 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-api" containerID="cri-o://90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617" gracePeriod=30 Oct 07 11:40:29 crc kubenswrapper[4700]: I1007 11:40:29.825371 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" podStartSLOduration=3.825347788 podStartE2EDuration="3.825347788s" podCreationTimestamp="2025-10-07 11:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:29.824879146 +0000 UTC m=+1196.621278165" watchObservedRunningTime="2025-10-07 11:40:29.825347788 +0000 UTC m=+1196.621746787" Oct 07 11:40:30 crc kubenswrapper[4700]: I1007 11:40:30.815611 4700 generic.go:334] "Generic (PLEG): container finished" podID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerID="9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355" exitCode=143 Oct 07 11:40:30 crc kubenswrapper[4700]: I1007 11:40:30.817835 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerDied","Data":"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355"} Oct 07 11:40:31 crc kubenswrapper[4700]: 
I1007 11:40:31.626626 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.712700 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.712797 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp6t8\" (UniqueName: \"kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.712863 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.712944 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713072 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713100 4700 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713123 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713149 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd\") pod \"693c5adf-f425-44be-8e3c-29c0f61b0e92\" (UID: \"693c5adf-f425-44be-8e3c-29c0f61b0e92\") " Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713484 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713573 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713894 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.713915 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c5adf-f425-44be-8e3c-29c0f61b0e92-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.735071 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8" (OuterVolumeSpecName: "kube-api-access-vp6t8") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "kube-api-access-vp6t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.735115 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts" (OuterVolumeSpecName: "scripts") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.758361 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.767793 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.807792 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.816208 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp6t8\" (UniqueName: \"kubernetes.io/projected/693c5adf-f425-44be-8e3c-29c0f61b0e92-kube-api-access-vp6t8\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.816235 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.816247 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.816257 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.816268 4700 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.823014 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data" (OuterVolumeSpecName: "config-data") pod "693c5adf-f425-44be-8e3c-29c0f61b0e92" (UID: "693c5adf-f425-44be-8e3c-29c0f61b0e92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.828748 4700 generic.go:334] "Generic (PLEG): container finished" podID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerID="4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c" exitCode=0 Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.828809 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerDied","Data":"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c"} Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.828846 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"693c5adf-f425-44be-8e3c-29c0f61b0e92","Type":"ContainerDied","Data":"80dea088c185aa94b55f05de476530cc72c7c618bddaae3e5905ac6f7ff63866"} Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.828866 4700 scope.go:117] "RemoveContainer" containerID="5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.828989 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.866809 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.867342 4700 scope.go:117] "RemoveContainer" containerID="27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.875718 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.889842 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.890256 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-notification-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890270 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-notification-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.890289 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="proxy-httpd" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890394 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="proxy-httpd" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.890427 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="sg-core" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890433 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="sg-core" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.890448 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-central-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890453 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-central-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890675 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-notification-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890690 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="ceilometer-central-agent" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890705 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="proxy-httpd" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.890720 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" containerName="sg-core" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.892470 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.896757 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.897156 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.897548 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.902705 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.918131 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c5adf-f425-44be-8e3c-29c0f61b0e92-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.926636 4700 scope.go:117] "RemoveContainer" containerID="4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.946628 4700 scope.go:117] "RemoveContainer" containerID="b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967090 4700 scope.go:117] "RemoveContainer" containerID="5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.967380 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c\": container with ID starting with 5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c not found: ID does not exist" containerID="5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c" Oct 07 11:40:31 crc 
kubenswrapper[4700]: I1007 11:40:31.967410 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c"} err="failed to get container status \"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c\": rpc error: code = NotFound desc = could not find container \"5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c\": container with ID starting with 5d78161f2dd0da41b827dcfcfc41105ee1b9588c0e970a345b5cb276a6d6b75c not found: ID does not exist" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967427 4700 scope.go:117] "RemoveContainer" containerID="27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.967627 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17\": container with ID starting with 27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17 not found: ID does not exist" containerID="27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967649 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17"} err="failed to get container status \"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17\": rpc error: code = NotFound desc = could not find container \"27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17\": container with ID starting with 27f2772e229a9fccdf3953bdf87966b7e6cf731f52c3d54454ed9c42125adc17 not found: ID does not exist" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967666 4700 scope.go:117] "RemoveContainer" containerID="4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c" Oct 07 
11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.967857 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c\": container with ID starting with 4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c not found: ID does not exist" containerID="4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967905 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c"} err="failed to get container status \"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c\": rpc error: code = NotFound desc = could not find container \"4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c\": container with ID starting with 4614db4c9416cf4a3bb10f28e2933f2520aa92ed982f9fa40c681f8a2541dd3c not found: ID does not exist" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.967924 4700 scope.go:117] "RemoveContainer" containerID="b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5" Oct 07 11:40:31 crc kubenswrapper[4700]: E1007 11:40:31.968119 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5\": container with ID starting with b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5 not found: ID does not exist" containerID="b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.968143 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5"} err="failed to get container status 
\"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5\": rpc error: code = NotFound desc = could not find container \"b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5\": container with ID starting with b5ca86d993705515467bc82d7b48430cac778aa41cff9a536cc32bccca8930f5 not found: ID does not exist" Oct 07 11:40:31 crc kubenswrapper[4700]: I1007 11:40:31.968240 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693c5adf-f425-44be-8e3c-29c0f61b0e92" path="/var/lib/kubelet/pods/693c5adf-f425-44be-8e3c-29c0f61b0e92/volumes" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.019671 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.019726 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgr42\" (UniqueName: \"kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.019754 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.019948 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.020037 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.020169 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.020229 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.020462 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121635 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121725 
4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121754 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgr42\" (UniqueName: \"kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121776 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121806 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121827 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121865 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.121886 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.122268 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.126559 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.128114 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.129448 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.129890 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.134098 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.134299 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.151241 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgr42\" (UniqueName: \"kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42\") pod \"ceilometer-0\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.218166 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.712457 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 11:40:32 crc kubenswrapper[4700]: W1007 11:40:32.725839 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5193cebf_c7b2_4e53_8dc1_d3c37a551e03.slice/crio-0596ae1af214f195bcd9b8510c5bacd2866f4897657cd316fd30e4da9350e02d WatchSource:0}: Error finding container 0596ae1af214f195bcd9b8510c5bacd2866f4897657cd316fd30e4da9350e02d: Status 404 returned error can't find the container with id 0596ae1af214f195bcd9b8510c5bacd2866f4897657cd316fd30e4da9350e02d Oct 07 11:40:32 crc kubenswrapper[4700]: I1007 11:40:32.841214 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerStarted","Data":"0596ae1af214f195bcd9b8510c5bacd2866f4897657cd316fd30e4da9350e02d"} Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.364392 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.415504 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.451678 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.453686 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs\") pod \"f820d4b3-174b-4f37-bdc5-69e416f94cde\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.453732 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data\") pod \"f820d4b3-174b-4f37-bdc5-69e416f94cde\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.453905 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vncmq\" (UniqueName: \"kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq\") pod \"f820d4b3-174b-4f37-bdc5-69e416f94cde\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.454040 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle\") pod \"f820d4b3-174b-4f37-bdc5-69e416f94cde\" (UID: \"f820d4b3-174b-4f37-bdc5-69e416f94cde\") " Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.454357 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs" (OuterVolumeSpecName: "logs") pod "f820d4b3-174b-4f37-bdc5-69e416f94cde" (UID: "f820d4b3-174b-4f37-bdc5-69e416f94cde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.454842 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f820d4b3-174b-4f37-bdc5-69e416f94cde-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.467698 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq" (OuterVolumeSpecName: "kube-api-access-vncmq") pod "f820d4b3-174b-4f37-bdc5-69e416f94cde" (UID: "f820d4b3-174b-4f37-bdc5-69e416f94cde"). InnerVolumeSpecName "kube-api-access-vncmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.491354 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data" (OuterVolumeSpecName: "config-data") pod "f820d4b3-174b-4f37-bdc5-69e416f94cde" (UID: "f820d4b3-174b-4f37-bdc5-69e416f94cde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.501241 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f820d4b3-174b-4f37-bdc5-69e416f94cde" (UID: "f820d4b3-174b-4f37-bdc5-69e416f94cde"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.556900 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.556942 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vncmq\" (UniqueName: \"kubernetes.io/projected/f820d4b3-174b-4f37-bdc5-69e416f94cde-kube-api-access-vncmq\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.556957 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f820d4b3-174b-4f37-bdc5-69e416f94cde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.854040 4700 generic.go:334] "Generic (PLEG): container finished" podID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerID="90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617" exitCode=0 Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.854100 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.854120 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerDied","Data":"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617"} Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.854179 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f820d4b3-174b-4f37-bdc5-69e416f94cde","Type":"ContainerDied","Data":"f53e7b5f250caf360ef94c6bdd6af6c373ea58b32c2027a3b9029303d09c2a16"} Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.854196 4700 scope.go:117] "RemoveContainer" containerID="90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.855865 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerStarted","Data":"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96"} Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.887861 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.896560 4700 scope.go:117] "RemoveContainer" containerID="9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.899753 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.932605 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.954320 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:33 crc kubenswrapper[4700]: E1007 11:40:33.954867 4700 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-log" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.954894 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-log" Oct 07 11:40:33 crc kubenswrapper[4700]: E1007 11:40:33.954922 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-api" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.954932 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-api" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.955151 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-log" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.955199 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" containerName="nova-api-api" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.964727 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.974871 4700 scope.go:117] "RemoveContainer" containerID="90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617" Oct 07 11:40:33 crc kubenswrapper[4700]: E1007 11:40:33.975373 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617\": container with ID starting with 90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617 not found: ID does not exist" containerID="90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.975570 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617"} err="failed to get container status \"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617\": rpc error: code = NotFound desc = could not find container \"90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617\": container with ID starting with 90cc66be56ec228c85965822e3e686743d3f0bd42296008dad5155fe0f6c1617 not found: ID does not exist" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.975593 4700 scope.go:117] "RemoveContainer" containerID="9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355" Oct 07 11:40:33 crc kubenswrapper[4700]: E1007 11:40:33.976453 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355\": container with ID starting with 9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355 not found: ID does not exist" containerID="9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355" Oct 07 11:40:33 crc kubenswrapper[4700]: I1007 11:40:33.976512 
4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355"} err="failed to get container status \"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355\": rpc error: code = NotFound desc = could not find container \"9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355\": container with ID starting with 9835cb7d8a7ba08185b02e2b9f4caadde1fa07d64b58c118e36bcdd6d9743355 not found: ID does not exist" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.048466 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.048816 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f820d4b3-174b-4f37-bdc5-69e416f94cde" path="/var/lib/kubelet/pods/f820d4b3-174b-4f37-bdc5-69e416f94cde/volumes" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.049118 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.049325 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.064220 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075343 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075521 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgwv\" (UniqueName: 
\"kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075593 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075680 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075708 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.075826 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.170362 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-9lkqh"] Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.172366 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.181040 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lkqh"] Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.181961 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.181993 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.182063 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.182134 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.182183 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgwv\" (UniqueName: \"kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 
11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.182224 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.185219 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.185882 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.189057 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.189170 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.191199 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.205709 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data\") pod \"nova-api-0\" (UID: 
\"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.209861 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgwv\" (UniqueName: \"kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.210406 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs\") pod \"nova-api-0\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.283859 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7qp\" (UniqueName: \"kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.284110 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.284263 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " 
pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.284437 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.366604 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.393090 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.393192 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.393327 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7qp\" (UniqueName: \"kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.393375 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.400043 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.400347 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.400689 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:34 crc kubenswrapper[4700]: I1007 11:40:34.429776 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7qp\" (UniqueName: \"kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp\") pod \"nova-cell1-cell-mapping-9lkqh\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:34.656167 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:34.852336 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:35 crc kubenswrapper[4700]: W1007 11:40:34.853405 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1342cb3b_48cc_4b55_8f08_476ca7d4d784.slice/crio-a45fe913b7e36c77c02fc5939a097085262443450d3986ad63eccbc27d812418 WatchSource:0}: Error finding container a45fe913b7e36c77c02fc5939a097085262443450d3986ad63eccbc27d812418: Status 404 returned error can't find the container with id a45fe913b7e36c77c02fc5939a097085262443450d3986ad63eccbc27d812418 Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:34.867468 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerStarted","Data":"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:34.869928 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerStarted","Data":"a45fe913b7e36c77c02fc5939a097085262443450d3986ad63eccbc27d812418"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.588402 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lkqh"] Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.880136 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerStarted","Data":"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.882230 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lkqh" 
event={"ID":"fea0db12-6a5a-4452-bc47-31df9a1bb76e","Type":"ContainerStarted","Data":"1464cd0c52b0ed5ed8665b4cfe7e385c1f777085b1d80c27077e90d3861e5c03"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.882277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lkqh" event={"ID":"fea0db12-6a5a-4452-bc47-31df9a1bb76e","Type":"ContainerStarted","Data":"73e1a2c032af3c8d472fd0dcf2d9b74acdd3d8bd3d94baeece582ee4df5a80a4"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.885075 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerStarted","Data":"c2619bfb1bc909a892d829a483670ee86e3a81085ad054d323cd766206d72fea"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.885127 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerStarted","Data":"0ab38e650022c4b9a60f517061cb512efa886ae3b59d2ada49a039c7772d35f1"} Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.914238 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-9lkqh" podStartSLOduration=1.914215899 podStartE2EDuration="1.914215899s" podCreationTimestamp="2025-10-07 11:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:35.905257115 +0000 UTC m=+1202.701656094" watchObservedRunningTime="2025-10-07 11:40:35.914215899 +0000 UTC m=+1202.710614888" Oct 07 11:40:35 crc kubenswrapper[4700]: I1007 11:40:35.937510 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.937486556 podStartE2EDuration="2.937486556s" podCreationTimestamp="2025-10-07 11:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:35.922104314 +0000 UTC m=+1202.718503373" watchObservedRunningTime="2025-10-07 11:40:35.937486556 +0000 UTC m=+1202.733885545" Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.309050 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.461395 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.462016 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="dnsmasq-dns" containerID="cri-o://9acc0339b9a2fc597468ca200b34801b9e52360bb8f3b2bf05a27c107357cbb1" gracePeriod=10 Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.919674 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerStarted","Data":"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74"} Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.920488 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.923283 4700 generic.go:334] "Generic (PLEG): container finished" podID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerID="9acc0339b9a2fc597468ca200b34801b9e52360bb8f3b2bf05a27c107357cbb1" exitCode=0 Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.923364 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" event={"ID":"bd2c5b31-ee31-445b-8f37-5f4def71e84e","Type":"ContainerDied","Data":"9acc0339b9a2fc597468ca200b34801b9e52360bb8f3b2bf05a27c107357cbb1"} Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.923537 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" event={"ID":"bd2c5b31-ee31-445b-8f37-5f4def71e84e","Type":"ContainerDied","Data":"abf76f66574295971ce593d81a56901c03e88d4b07599a4002e68b2ebdef8db8"} Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.923607 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf76f66574295971ce593d81a56901c03e88d4b07599a4002e68b2ebdef8db8" Oct 07 11:40:37 crc kubenswrapper[4700]: I1007 11:40:37.948230 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.865141081 podStartE2EDuration="6.948212172s" podCreationTimestamp="2025-10-07 11:40:31 +0000 UTC" firstStartedPulling="2025-10-07 11:40:32.731746822 +0000 UTC m=+1199.528145851" lastFinishedPulling="2025-10-07 11:40:36.814817943 +0000 UTC m=+1203.611216942" observedRunningTime="2025-10-07 11:40:37.948141101 +0000 UTC m=+1204.744540090" watchObservedRunningTime="2025-10-07 11:40:37.948212172 +0000 UTC m=+1204.744611161" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.001537 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.177955 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.178283 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.178326 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.178497 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.178606 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.178727 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkgc4\" 
(UniqueName: \"kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4\") pod \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\" (UID: \"bd2c5b31-ee31-445b-8f37-5f4def71e84e\") " Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.184554 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4" (OuterVolumeSpecName: "kube-api-access-gkgc4") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "kube-api-access-gkgc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.231153 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.238868 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.241621 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.245760 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config" (OuterVolumeSpecName: "config") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.262149 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd2c5b31-ee31-445b-8f37-5f4def71e84e" (UID: "bd2c5b31-ee31-445b-8f37-5f4def71e84e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.281568 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkgc4\" (UniqueName: \"kubernetes.io/projected/bd2c5b31-ee31-445b-8f37-5f4def71e84e-kube-api-access-gkgc4\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.281799 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.281889 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.281971 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 
11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.282161 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.282249 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2c5b31-ee31-445b-8f37-5f4def71e84e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:38 crc kubenswrapper[4700]: I1007 11:40:38.934809 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-c9svf" Oct 07 11:40:39 crc kubenswrapper[4700]: I1007 11:40:39.014481 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:40:39 crc kubenswrapper[4700]: I1007 11:40:39.030712 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-c9svf"] Oct 07 11:40:39 crc kubenswrapper[4700]: I1007 11:40:39.968930 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" path="/var/lib/kubelet/pods/bd2c5b31-ee31-445b-8f37-5f4def71e84e/volumes" Oct 07 11:40:40 crc kubenswrapper[4700]: I1007 11:40:40.956649 4700 generic.go:334] "Generic (PLEG): container finished" podID="fea0db12-6a5a-4452-bc47-31df9a1bb76e" containerID="1464cd0c52b0ed5ed8665b4cfe7e385c1f777085b1d80c27077e90d3861e5c03" exitCode=0 Oct 07 11:40:40 crc kubenswrapper[4700]: I1007 11:40:40.956698 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lkqh" event={"ID":"fea0db12-6a5a-4452-bc47-31df9a1bb76e","Type":"ContainerDied","Data":"1464cd0c52b0ed5ed8665b4cfe7e385c1f777085b1d80c27077e90d3861e5c03"} Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.336882 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.461383 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s7qp\" (UniqueName: \"kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp\") pod \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.461477 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data\") pod \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.461539 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts\") pod \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.461608 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle\") pod \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\" (UID: \"fea0db12-6a5a-4452-bc47-31df9a1bb76e\") " Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.475477 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts" (OuterVolumeSpecName: "scripts") pod "fea0db12-6a5a-4452-bc47-31df9a1bb76e" (UID: "fea0db12-6a5a-4452-bc47-31df9a1bb76e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.475499 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp" (OuterVolumeSpecName: "kube-api-access-5s7qp") pod "fea0db12-6a5a-4452-bc47-31df9a1bb76e" (UID: "fea0db12-6a5a-4452-bc47-31df9a1bb76e"). InnerVolumeSpecName "kube-api-access-5s7qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.511199 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data" (OuterVolumeSpecName: "config-data") pod "fea0db12-6a5a-4452-bc47-31df9a1bb76e" (UID: "fea0db12-6a5a-4452-bc47-31df9a1bb76e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.525644 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fea0db12-6a5a-4452-bc47-31df9a1bb76e" (UID: "fea0db12-6a5a-4452-bc47-31df9a1bb76e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.564124 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s7qp\" (UniqueName: \"kubernetes.io/projected/fea0db12-6a5a-4452-bc47-31df9a1bb76e-kube-api-access-5s7qp\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.564153 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.564161 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.564172 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea0db12-6a5a-4452-bc47-31df9a1bb76e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.981961 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lkqh" event={"ID":"fea0db12-6a5a-4452-bc47-31df9a1bb76e","Type":"ContainerDied","Data":"73e1a2c032af3c8d472fd0dcf2d9b74acdd3d8bd3d94baeece582ee4df5a80a4"} Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.982003 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73e1a2c032af3c8d472fd0dcf2d9b74acdd3d8bd3d94baeece582ee4df5a80a4" Oct 07 11:40:42 crc kubenswrapper[4700]: I1007 11:40:42.982079 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lkqh" Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.180641 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.180919 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-log" containerID="cri-o://0ab38e650022c4b9a60f517061cb512efa886ae3b59d2ada49a039c7772d35f1" gracePeriod=30 Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.180990 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-api" containerID="cri-o://c2619bfb1bc909a892d829a483670ee86e3a81085ad054d323cd766206d72fea" gracePeriod=30 Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.209408 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.209779 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerName="nova-scheduler-scheduler" containerID="cri-o://54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" gracePeriod=30 Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.224197 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.224504 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" containerID="cri-o://f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5" gracePeriod=30 Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.224655 4700 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" containerID="cri-o://3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3" gracePeriod=30 Oct 07 11:40:43 crc kubenswrapper[4700]: I1007 11:40:43.998680 4700 generic.go:334] "Generic (PLEG): container finished" podID="13922810-c6e2-4774-b143-a1aadd32210b" containerID="f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5" exitCode=143 Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:43.998886 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerDied","Data":"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5"} Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.004858 4700 generic.go:334] "Generic (PLEG): container finished" podID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerID="c2619bfb1bc909a892d829a483670ee86e3a81085ad054d323cd766206d72fea" exitCode=0 Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.004886 4700 generic.go:334] "Generic (PLEG): container finished" podID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerID="0ab38e650022c4b9a60f517061cb512efa886ae3b59d2ada49a039c7772d35f1" exitCode=143 Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.004936 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerDied","Data":"c2619bfb1bc909a892d829a483670ee86e3a81085ad054d323cd766206d72fea"} Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.004958 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerDied","Data":"0ab38e650022c4b9a60f517061cb512efa886ae3b59d2ada49a039c7772d35f1"} Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 
11:40:44.090660 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:44 crc kubenswrapper[4700]: E1007 11:40:44.173966 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:44 crc kubenswrapper[4700]: E1007 11:40:44.175428 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:44 crc kubenswrapper[4700]: E1007 11:40:44.177755 4700 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 11:40:44 crc kubenswrapper[4700]: E1007 11:40:44.177782 4700 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerName="nova-scheduler-scheduler" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193182 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvgwv\" (UniqueName: \"kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" 
(UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193402 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193428 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193486 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193533 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193616 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data\") pod \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\" (UID: \"1342cb3b-48cc-4b55-8f08-476ca7d4d784\") " Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.193744 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs" (OuterVolumeSpecName: "logs") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.194917 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1342cb3b-48cc-4b55-8f08-476ca7d4d784-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.199952 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv" (OuterVolumeSpecName: "kube-api-access-kvgwv") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "kube-api-access-kvgwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.222964 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data" (OuterVolumeSpecName: "config-data") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.236818 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.250576 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.251205 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1342cb3b-48cc-4b55-8f08-476ca7d4d784" (UID: "1342cb3b-48cc-4b55-8f08-476ca7d4d784"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.296415 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.296594 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.296645 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.296692 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342cb3b-48cc-4b55-8f08-476ca7d4d784-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 07 11:40:44 crc kubenswrapper[4700]: I1007 11:40:44.296739 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvgwv\" (UniqueName: \"kubernetes.io/projected/1342cb3b-48cc-4b55-8f08-476ca7d4d784-kube-api-access-kvgwv\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.016849 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1342cb3b-48cc-4b55-8f08-476ca7d4d784","Type":"ContainerDied","Data":"a45fe913b7e36c77c02fc5939a097085262443450d3986ad63eccbc27d812418"} Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.017265 4700 scope.go:117] "RemoveContainer" containerID="c2619bfb1bc909a892d829a483670ee86e3a81085ad054d323cd766206d72fea" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.016881 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.050270 4700 scope.go:117] "RemoveContainer" containerID="0ab38e650022c4b9a60f517061cb512efa886ae3b59d2ada49a039c7772d35f1" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.066389 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.106462 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.115517 4700 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1342cb3b_48cc_4b55_8f08_476ca7d4d784.slice\": RecentStats: unable to find data in memory cache]" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.119605 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.120545 4700 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="init" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.120587 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="init" Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.120615 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-log" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.120625 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-log" Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.120686 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-api" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.120696 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-api" Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.120725 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="dnsmasq-dns" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.120733 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="dnsmasq-dns" Oct 07 11:40:45 crc kubenswrapper[4700]: E1007 11:40:45.120768 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea0db12-6a5a-4452-bc47-31df9a1bb76e" containerName="nova-manage" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.120775 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea0db12-6a5a-4452-bc47-31df9a1bb76e" containerName="nova-manage" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.121222 4700 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd2c5b31-ee31-445b-8f37-5f4def71e84e" containerName="dnsmasq-dns" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.121265 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-log" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.121332 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" containerName="nova-api-api" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.121358 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea0db12-6a5a-4452-bc47-31df9a1bb76e" containerName="nova-manage" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.124042 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.126359 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.126647 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.126811 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.132078 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.221265 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-logs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.221659 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.221801 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.221959 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.222069 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-config-data\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.222167 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c4b\" (UniqueName: \"kubernetes.io/projected/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-kube-api-access-b6c4b\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.323264 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.323625 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.323777 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-config-data\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.323872 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c4b\" (UniqueName: \"kubernetes.io/projected/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-kube-api-access-b6c4b\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.323972 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-logs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.324183 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.325104 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-logs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.329930 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.330833 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.333099 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.346300 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-config-data\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.347351 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c4b\" (UniqueName: \"kubernetes.io/projected/2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88-kube-api-access-b6c4b\") pod \"nova-api-0\" (UID: \"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88\") " pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.450535 4700 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.774955 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 11:40:45 crc kubenswrapper[4700]: I1007 11:40:45.970895 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1342cb3b-48cc-4b55-8f08-476ca7d4d784" path="/var/lib/kubelet/pods/1342cb3b-48cc-4b55-8f08-476ca7d4d784/volumes" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.029174 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88","Type":"ContainerStarted","Data":"ef993fed98bf1a57a29bd520808e0eadceed13e99dae093e38834d0ad7fa68b6"} Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.029218 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88","Type":"ContainerStarted","Data":"0b708bc05465a7070416251cf49ce26868276213994aee91a1a87c112f0d4033"} Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.381500 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:36188->10.217.0.203:8775: read: connection reset by peer" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.381578 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:36192->10.217.0.203:8775: read: connection reset by peer" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.806702 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.857331 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5t5\" (UniqueName: \"kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5\") pod \"13922810-c6e2-4774-b143-a1aadd32210b\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.857388 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle\") pod \"13922810-c6e2-4774-b143-a1aadd32210b\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.857514 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data\") pod \"13922810-c6e2-4774-b143-a1aadd32210b\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.857613 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs\") pod \"13922810-c6e2-4774-b143-a1aadd32210b\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.857638 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs\") pod \"13922810-c6e2-4774-b143-a1aadd32210b\" (UID: \"13922810-c6e2-4774-b143-a1aadd32210b\") " Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.859447 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs" (OuterVolumeSpecName: "logs") pod "13922810-c6e2-4774-b143-a1aadd32210b" (UID: "13922810-c6e2-4774-b143-a1aadd32210b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.886177 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5" (OuterVolumeSpecName: "kube-api-access-kp5t5") pod "13922810-c6e2-4774-b143-a1aadd32210b" (UID: "13922810-c6e2-4774-b143-a1aadd32210b"). InnerVolumeSpecName "kube-api-access-kp5t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.905639 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data" (OuterVolumeSpecName: "config-data") pod "13922810-c6e2-4774-b143-a1aadd32210b" (UID: "13922810-c6e2-4774-b143-a1aadd32210b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.906437 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13922810-c6e2-4774-b143-a1aadd32210b" (UID: "13922810-c6e2-4774-b143-a1aadd32210b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.915784 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "13922810-c6e2-4774-b143-a1aadd32210b" (UID: "13922810-c6e2-4774-b143-a1aadd32210b"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.959015 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5t5\" (UniqueName: \"kubernetes.io/projected/13922810-c6e2-4774-b143-a1aadd32210b-kube-api-access-kp5t5\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.959048 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.959061 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.959073 4700 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13922810-c6e2-4774-b143-a1aadd32210b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:46 crc kubenswrapper[4700]: I1007 11:40:46.959086 4700 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13922810-c6e2-4774-b143-a1aadd32210b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.039812 4700 generic.go:334] "Generic (PLEG): container finished" podID="13922810-c6e2-4774-b143-a1aadd32210b" containerID="3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3" exitCode=0 Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.039855 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerDied","Data":"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3"} 
Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.039902 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13922810-c6e2-4774-b143-a1aadd32210b","Type":"ContainerDied","Data":"2a52d001e1fa210b393fa3e7b5e1b668d8bbf8da3c2df86dbb1c4ab45a67493d"} Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.039898 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.039926 4700 scope.go:117] "RemoveContainer" containerID="3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.042882 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88","Type":"ContainerStarted","Data":"013677261d1aeff03e6d3811b64af6e9039c0a3b9b7f4b47c1ac972c69228c61"} Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.073039 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.073023945 podStartE2EDuration="2.073023945s" podCreationTimestamp="2025-10-07 11:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:47.067558843 +0000 UTC m=+1213.863957832" watchObservedRunningTime="2025-10-07 11:40:47.073023945 +0000 UTC m=+1213.869422934" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.076001 4700 scope.go:117] "RemoveContainer" containerID="f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.093582 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.113545 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 
11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.114517 4700 scope.go:117] "RemoveContainer" containerID="3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3" Oct 07 11:40:47 crc kubenswrapper[4700]: E1007 11:40:47.116774 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3\": container with ID starting with 3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3 not found: ID does not exist" containerID="3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.116820 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3"} err="failed to get container status \"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3\": rpc error: code = NotFound desc = could not find container \"3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3\": container with ID starting with 3a158e372c8eecc57c570519337623538a48b0fcb5d43bec1ba34297cc3c49d3 not found: ID does not exist" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.116851 4700 scope.go:117] "RemoveContainer" containerID="f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5" Oct 07 11:40:47 crc kubenswrapper[4700]: E1007 11:40:47.117181 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5\": container with ID starting with f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5 not found: ID does not exist" containerID="f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.117202 4700 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5"} err="failed to get container status \"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5\": rpc error: code = NotFound desc = could not find container \"f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5\": container with ID starting with f6def2a9779086cf4b364f91fea389c3b9c73adefa706e818fe269f437314ab5 not found: ID does not exist" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.131257 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:40:47 crc kubenswrapper[4700]: E1007 11:40:47.131716 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.131733 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" Oct 07 11:40:47 crc kubenswrapper[4700]: E1007 11:40:47.131765 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.131771 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.131976 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-metadata" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.131993 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="13922810-c6e2-4774-b143-a1aadd32210b" containerName="nova-metadata-log" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.132966 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.135247 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.135521 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.138654 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.268546 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.268614 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-config-data\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.269426 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrsg\" (UniqueName: \"kubernetes.io/projected/adba6450-c198-456b-a139-67d93e54847b-kube-api-access-ntrsg\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.269595 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adba6450-c198-456b-a139-67d93e54847b-logs\") pod \"nova-metadata-0\" (UID: 
\"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.269732 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.371131 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrsg\" (UniqueName: \"kubernetes.io/projected/adba6450-c198-456b-a139-67d93e54847b-kube-api-access-ntrsg\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.371247 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adba6450-c198-456b-a139-67d93e54847b-logs\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.371361 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.371424 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 
11:40:47.371462 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-config-data\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.371851 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adba6450-c198-456b-a139-67d93e54847b-logs\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.376367 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-config-data\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.376766 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.389872 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adba6450-c198-456b-a139-67d93e54847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.416001 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrsg\" (UniqueName: \"kubernetes.io/projected/adba6450-c198-456b-a139-67d93e54847b-kube-api-access-ntrsg\") pod 
\"nova-metadata-0\" (UID: \"adba6450-c198-456b-a139-67d93e54847b\") " pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.483441 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.980348 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13922810-c6e2-4774-b143-a1aadd32210b" path="/var/lib/kubelet/pods/13922810-c6e2-4774-b143-a1aadd32210b/volumes" Oct 07 11:40:47 crc kubenswrapper[4700]: I1007 11:40:47.988106 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.055778 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adba6450-c198-456b-a139-67d93e54847b","Type":"ContainerStarted","Data":"188133f98931f512d348595138072dd275a798183aa49afba43945c8b0af1610"} Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.777653 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.904714 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data\") pod \"81608928-68ca-42fc-9839-c19d6efdb0f1\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.904868 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle\") pod \"81608928-68ca-42fc-9839-c19d6efdb0f1\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.904908 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7hm\" (UniqueName: \"kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm\") pod \"81608928-68ca-42fc-9839-c19d6efdb0f1\" (UID: \"81608928-68ca-42fc-9839-c19d6efdb0f1\") " Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.910990 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm" (OuterVolumeSpecName: "kube-api-access-nf7hm") pod "81608928-68ca-42fc-9839-c19d6efdb0f1" (UID: "81608928-68ca-42fc-9839-c19d6efdb0f1"). InnerVolumeSpecName "kube-api-access-nf7hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.951698 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81608928-68ca-42fc-9839-c19d6efdb0f1" (UID: "81608928-68ca-42fc-9839-c19d6efdb0f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:48 crc kubenswrapper[4700]: I1007 11:40:48.951924 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data" (OuterVolumeSpecName: "config-data") pod "81608928-68ca-42fc-9839-c19d6efdb0f1" (UID: "81608928-68ca-42fc-9839-c19d6efdb0f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.007971 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.008008 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81608928-68ca-42fc-9839-c19d6efdb0f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.008026 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7hm\" (UniqueName: \"kubernetes.io/projected/81608928-68ca-42fc-9839-c19d6efdb0f1-kube-api-access-nf7hm\") on node \"crc\" DevicePath \"\"" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.066944 4700 generic.go:334] "Generic (PLEG): container finished" podID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" exitCode=0 Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.067269 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.069883 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"81608928-68ca-42fc-9839-c19d6efdb0f1","Type":"ContainerDied","Data":"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc"} Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.070069 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"81608928-68ca-42fc-9839-c19d6efdb0f1","Type":"ContainerDied","Data":"45bf8fc13a710b2a3a9366073caaef4859071654cc7140ea8c09697825fe0096"} Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.070153 4700 scope.go:117] "RemoveContainer" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.072356 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adba6450-c198-456b-a139-67d93e54847b","Type":"ContainerStarted","Data":"1c565ed83c63a30278381966e7584fc7e71bc724ca0895e20da8f9574ea544d9"} Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.072398 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"adba6450-c198-456b-a139-67d93e54847b","Type":"ContainerStarted","Data":"444025b25dee99a2bd822013c8b8b78826ce843fb687fe5a5c99cad3065c33f3"} Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.104695 4700 scope.go:117] "RemoveContainer" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" Oct 07 11:40:49 crc kubenswrapper[4700]: E1007 11:40:49.105659 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc\": container with ID starting with 54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc not found: ID does not 
exist" containerID="54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.105699 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc"} err="failed to get container status \"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc\": rpc error: code = NotFound desc = could not find container \"54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc\": container with ID starting with 54cd06d47f27c76e44fa917539a672e3c0613f3b43ae5f0c3445b0f6b5e798dc not found: ID does not exist" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.126131 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.126111038 podStartE2EDuration="2.126111038s" podCreationTimestamp="2025-10-07 11:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:49.109547166 +0000 UTC m=+1215.905946155" watchObservedRunningTime="2025-10-07 11:40:49.126111038 +0000 UTC m=+1215.922510027" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.151390 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.165413 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.185351 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:49 crc kubenswrapper[4700]: E1007 11:40:49.185766 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerName="nova-scheduler-scheduler" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.185781 4700 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerName="nova-scheduler-scheduler" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.185963 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" containerName="nova-scheduler-scheduler" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.186636 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.191651 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.198103 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.220554 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ht9\" (UniqueName: \"kubernetes.io/projected/5785d364-839d-453a-a35f-b95ea89c2152-kube-api-access-n7ht9\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.220747 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.220957 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-config-data\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc 
kubenswrapper[4700]: I1007 11:40:49.322625 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ht9\" (UniqueName: \"kubernetes.io/projected/5785d364-839d-453a-a35f-b95ea89c2152-kube-api-access-n7ht9\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.322740 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.322827 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-config-data\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.326692 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.326865 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5785d364-839d-453a-a35f-b95ea89c2152-config-data\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.337297 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ht9\" (UniqueName: 
\"kubernetes.io/projected/5785d364-839d-453a-a35f-b95ea89c2152-kube-api-access-n7ht9\") pod \"nova-scheduler-0\" (UID: \"5785d364-839d-453a-a35f-b95ea89c2152\") " pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.537952 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 11:40:49 crc kubenswrapper[4700]: I1007 11:40:49.969985 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81608928-68ca-42fc-9839-c19d6efdb0f1" path="/var/lib/kubelet/pods/81608928-68ca-42fc-9839-c19d6efdb0f1/volumes" Oct 07 11:40:50 crc kubenswrapper[4700]: I1007 11:40:50.057991 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 11:40:50 crc kubenswrapper[4700]: W1007 11:40:50.063362 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5785d364_839d_453a_a35f_b95ea89c2152.slice/crio-5309e08c30b1e0c5807389be2ac075c9d0c4dc95116a66933cf369a31a46bb9a WatchSource:0}: Error finding container 5309e08c30b1e0c5807389be2ac075c9d0c4dc95116a66933cf369a31a46bb9a: Status 404 returned error can't find the container with id 5309e08c30b1e0c5807389be2ac075c9d0c4dc95116a66933cf369a31a46bb9a Oct 07 11:40:50 crc kubenswrapper[4700]: I1007 11:40:50.086247 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5785d364-839d-453a-a35f-b95ea89c2152","Type":"ContainerStarted","Data":"5309e08c30b1e0c5807389be2ac075c9d0c4dc95116a66933cf369a31a46bb9a"} Oct 07 11:40:51 crc kubenswrapper[4700]: I1007 11:40:51.109356 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5785d364-839d-453a-a35f-b95ea89c2152","Type":"ContainerStarted","Data":"6fda328357d01e973ea929b867d8f9131367856ec99a7a0ee7c4c96d941114d1"} Oct 07 11:40:51 crc kubenswrapper[4700]: I1007 11:40:51.137433 4700 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.13741147 podStartE2EDuration="2.13741147s" podCreationTimestamp="2025-10-07 11:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:40:51.128663611 +0000 UTC m=+1217.925062630" watchObservedRunningTime="2025-10-07 11:40:51.13741147 +0000 UTC m=+1217.933810479" Oct 07 11:40:52 crc kubenswrapper[4700]: I1007 11:40:52.484387 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 11:40:52 crc kubenswrapper[4700]: I1007 11:40:52.484662 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 11:40:54 crc kubenswrapper[4700]: I1007 11:40:54.538662 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 11:40:55 crc kubenswrapper[4700]: I1007 11:40:55.451083 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:40:55 crc kubenswrapper[4700]: I1007 11:40:55.451429 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 11:40:56 crc kubenswrapper[4700]: I1007 11:40:56.470570 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:56 crc kubenswrapper[4700]: I1007 11:40:56.470580 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:57 crc kubenswrapper[4700]: I1007 11:40:57.483657 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 11:40:57 crc kubenswrapper[4700]: I1007 11:40:57.484017 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 11:40:58 crc kubenswrapper[4700]: I1007 11:40:58.502441 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="adba6450-c198-456b-a139-67d93e54847b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:58 crc kubenswrapper[4700]: I1007 11:40:58.502481 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="adba6450-c198-456b-a139-67d93e54847b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 11:40:59 crc kubenswrapper[4700]: I1007 11:40:59.538803 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 11:40:59 crc kubenswrapper[4700]: I1007 11:40:59.580639 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 11:41:00 crc kubenswrapper[4700]: I1007 11:41:00.300677 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 11:41:02 crc kubenswrapper[4700]: I1007 11:41:02.259641 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.461718 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 
11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.462095 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.462289 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.462593 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.472753 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 11:41:05 crc kubenswrapper[4700]: I1007 11:41:05.475294 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 11:41:07 crc kubenswrapper[4700]: I1007 11:41:07.495965 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 11:41:07 crc kubenswrapper[4700]: I1007 11:41:07.497166 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 11:41:07 crc kubenswrapper[4700]: I1007 11:41:07.505448 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 11:41:08 crc kubenswrapper[4700]: I1007 11:41:08.335871 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 11:41:16 crc kubenswrapper[4700]: I1007 11:41:16.295592 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:17 crc kubenswrapper[4700]: I1007 11:41:17.337940 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:20 crc kubenswrapper[4700]: I1007 11:41:20.164921 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="rabbitmq" containerID="cri-o://726411ac54738387dce7118ef2b69e0ad2b1bb7ebb6f7ad31f7c1fec21f41203" gracePeriod=604797 Oct 07 11:41:21 crc kubenswrapper[4700]: I1007 11:41:21.279179 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="rabbitmq" containerID="cri-o://a21209fc50993df12343081ce8aa6918e4133ef130b5b9f4e4132912fb4c2658" gracePeriod=604797 Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.520334 4700 generic.go:334] "Generic (PLEG): container finished" podID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerID="726411ac54738387dce7118ef2b69e0ad2b1bb7ebb6f7ad31f7c1fec21f41203" exitCode=0 Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.520704 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerDied","Data":"726411ac54738387dce7118ef2b69e0ad2b1bb7ebb6f7ad31f7c1fec21f41203"} Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.822691 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905618 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905695 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4djh2\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905778 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905807 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905843 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905869 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905890 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905905 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.905962 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.906002 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.906069 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info\") pod \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\" (UID: \"ef7fab2e-f9fb-429f-9d47-e03f68165a13\") " Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 
11:41:26.907547 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.908475 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.910875 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.917435 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2" (OuterVolumeSpecName: "kube-api-access-4djh2") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "kube-api-access-4djh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.923912 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.924034 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.945545 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.952437 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info" (OuterVolumeSpecName: "pod-info") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.965127 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data" (OuterVolumeSpecName: "config-data") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:26 crc kubenswrapper[4700]: I1007 11:41:26.990864 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf" (OuterVolumeSpecName: "server-conf") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008673 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008712 4700 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef7fab2e-f9fb-429f-9d47-e03f68165a13-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008727 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008742 4700 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-plugins-conf\") on node \"crc\" 
DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008753 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008787 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008799 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008811 4700 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef7fab2e-f9fb-429f-9d47-e03f68165a13-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008823 4700 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef7fab2e-f9fb-429f-9d47-e03f68165a13-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.008835 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4djh2\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-kube-api-access-4djh2\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.049691 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.078561 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ef7fab2e-f9fb-429f-9d47-e03f68165a13" (UID: "ef7fab2e-f9fb-429f-9d47-e03f68165a13"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.110725 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef7fab2e-f9fb-429f-9d47-e03f68165a13-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.110754 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.542007 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.542011 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef7fab2e-f9fb-429f-9d47-e03f68165a13","Type":"ContainerDied","Data":"f94595c8f81952bd3c5ae2e91916c5628b3b15be08690c6ed5e0048ae938e164"} Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.543147 4700 scope.go:117] "RemoveContainer" containerID="726411ac54738387dce7118ef2b69e0ad2b1bb7ebb6f7ad31f7c1fec21f41203" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.557589 4700 generic.go:334] "Generic (PLEG): container finished" podID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerID="a21209fc50993df12343081ce8aa6918e4133ef130b5b9f4e4132912fb4c2658" exitCode=0 Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.557646 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerDied","Data":"a21209fc50993df12343081ce8aa6918e4133ef130b5b9f4e4132912fb4c2658"} Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.591008 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.602748 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.613084 4700 scope.go:117] "RemoveContainer" containerID="3ed13cf2f5a8cd7c7fa788fceb5e49f8318d6db89d3b8d267bcf7127ecee2337" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.628862 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:27 crc kubenswrapper[4700]: E1007 11:41:27.629284 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="rabbitmq" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.629358 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="rabbitmq" Oct 07 11:41:27 crc kubenswrapper[4700]: E1007 11:41:27.629388 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="setup-container" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.629395 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="setup-container" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.629604 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" containerName="rabbitmq" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.630629 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.633729 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.633924 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vd2wl" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.634052 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.634189 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.634329 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.634340 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.634559 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.665888 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.721893 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.721971 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.721999 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722025 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722209 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-kube-api-access-nxm72\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722260 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-config-data\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722315 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722397 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722474 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722514 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.722597 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824485 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824544 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824573 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824615 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824673 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-kube-api-access-nxm72\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824702 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-config-data\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824729 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824776 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824809 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824831 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.824885 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.825199 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.825400 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.826033 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.826176 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-config-data\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.826284 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.827627 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 
11:41:27.832405 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.832928 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.838952 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.843076 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.846845 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/509a6d73-2ff1-43f5-aa66-97d3a7d10e88-kube-api-access-nxm72\") pod \"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.864350 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"509a6d73-2ff1-43f5-aa66-97d3a7d10e88\") " pod="openstack/rabbitmq-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.937208 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:27 crc kubenswrapper[4700]: I1007 11:41:27.986552 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7fab2e-f9fb-429f-9d47-e03f68165a13" path="/var/lib/kubelet/pods/ef7fab2e-f9fb-429f-9d47-e03f68165a13/volumes" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027713 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027728 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027772 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027865 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvnd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027880 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027905 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.027953 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.028008 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.028048 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.028091 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.028128 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.028150 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret\") pod \"ca1c2675-0718-4979-98b8-9227bc9c5f18\" (UID: \"ca1c2675-0718-4979-98b8-9227bc9c5f18\") " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.029296 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.029929 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.031765 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.033207 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.033222 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.035870 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd" (OuterVolumeSpecName: "kube-api-access-9fvnd") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "kube-api-access-9fvnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.038091 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info" (OuterVolumeSpecName: "pod-info") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.045375 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.070387 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data" (OuterVolumeSpecName: "config-data") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.104439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf" (OuterVolumeSpecName: "server-conf") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129780 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvnd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-kube-api-access-9fvnd\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129831 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129840 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129849 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129857 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129865 4700 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca1c2675-0718-4979-98b8-9227bc9c5f18-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129874 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129884 4700 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca1c2675-0718-4979-98b8-9227bc9c5f18-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129892 4700 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.129900 4700 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca1c2675-0718-4979-98b8-9227bc9c5f18-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.158034 4700 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.173861 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ca1c2675-0718-4979-98b8-9227bc9c5f18" (UID: "ca1c2675-0718-4979-98b8-9227bc9c5f18"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.231684 4700 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca1c2675-0718-4979-98b8-9227bc9c5f18-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.231712 4700 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.548425 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.571783 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"509a6d73-2ff1-43f5-aa66-97d3a7d10e88","Type":"ContainerStarted","Data":"5f73dc4e389e26a4dd4f86d7e602a07bcf934fc93e9aed4247348e190e830998"} Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.574095 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca1c2675-0718-4979-98b8-9227bc9c5f18","Type":"ContainerDied","Data":"97dee82d6cecfb4d568af852269a802ff35fe4ba299cfd5641769b22d8a1468e"} Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.574143 4700 scope.go:117] "RemoveContainer" containerID="a21209fc50993df12343081ce8aa6918e4133ef130b5b9f4e4132912fb4c2658" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.574359 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.707897 4700 scope.go:117] "RemoveContainer" containerID="a46dc66bd8afce5928f040158bfee6805a39b6a68e5c8e88e1b819c3300cd2ab" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.730193 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.742199 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.766216 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:28 crc kubenswrapper[4700]: E1007 11:41:28.769759 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="setup-container" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.769815 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="setup-container" Oct 07 11:41:28 crc kubenswrapper[4700]: E1007 11:41:28.769826 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="rabbitmq" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.769833 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="rabbitmq" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.770096 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" containerName="rabbitmq" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.776405 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.780281 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.780604 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.780787 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g79lm" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.781240 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.781438 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.781618 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.782397 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.786337 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849177 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849538 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849571 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849594 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849612 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/abea1f83-cad5-40e9-a9d7-543660436ae0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849645 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/abea1f83-cad5-40e9-a9d7-543660436ae0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849700 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849724 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849759 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849787 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwkp\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-kube-api-access-hgwkp\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.849836 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952416 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952531 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952572 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952592 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952610 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/abea1f83-cad5-40e9-a9d7-543660436ae0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952656 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/abea1f83-cad5-40e9-a9d7-543660436ae0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952698 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952736 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952769 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952821 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwkp\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-kube-api-access-hgwkp\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.952894 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.953022 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.953040 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.953444 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.953944 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.954662 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.957618 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/abea1f83-cad5-40e9-a9d7-543660436ae0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.957645 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/abea1f83-cad5-40e9-a9d7-543660436ae0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.959405 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.962443 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/abea1f83-cad5-40e9-a9d7-543660436ae0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.965181 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.978266 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwkp\" (UniqueName: 
\"kubernetes.io/projected/abea1f83-cad5-40e9-a9d7-543660436ae0-kube-api-access-hgwkp\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:28 crc kubenswrapper[4700]: I1007 11:41:28.983437 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"abea1f83-cad5-40e9-a9d7-543660436ae0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:29 crc kubenswrapper[4700]: I1007 11:41:29.123262 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:41:29 crc kubenswrapper[4700]: I1007 11:41:29.596760 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 11:41:29 crc kubenswrapper[4700]: W1007 11:41:29.608500 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabea1f83_cad5_40e9_a9d7_543660436ae0.slice/crio-c07b383b0cc9451b0f1f4fa8b9da8ee8c4b6edb654c4ebc67be8804bfea732d8 WatchSource:0}: Error finding container c07b383b0cc9451b0f1f4fa8b9da8ee8c4b6edb654c4ebc67be8804bfea732d8: Status 404 returned error can't find the container with id c07b383b0cc9451b0f1f4fa8b9da8ee8c4b6edb654c4ebc67be8804bfea732d8 Oct 07 11:41:29 crc kubenswrapper[4700]: I1007 11:41:29.971420 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1c2675-0718-4979-98b8-9227bc9c5f18" path="/var/lib/kubelet/pods/ca1c2675-0718-4979-98b8-9227bc9c5f18/volumes" Oct 07 11:41:30 crc kubenswrapper[4700]: I1007 11:41:30.602693 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"509a6d73-2ff1-43f5-aa66-97d3a7d10e88","Type":"ContainerStarted","Data":"6ccdefc80b430cbdc66320262c4256099ee2e5f2853dee495f9ff4aca2a7c7ca"} Oct 07 
11:41:30 crc kubenswrapper[4700]: I1007 11:41:30.605336 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"abea1f83-cad5-40e9-a9d7-543660436ae0","Type":"ContainerStarted","Data":"c07b383b0cc9451b0f1f4fa8b9da8ee8c4b6edb654c4ebc67be8804bfea732d8"} Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.492201 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.494451 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.505933 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.515469 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633622 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633722 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633773 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqv2f\" (UniqueName: 
\"kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633840 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633934 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.633986 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.634055 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.642819 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"abea1f83-cad5-40e9-a9d7-543660436ae0","Type":"ContainerStarted","Data":"30c32379454a9af2e70ad016ca486180a302228ab6bdce17e0d971b74c66b2c2"} Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.736248 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.736858 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.736893 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqv2f\" (UniqueName: \"kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.737258 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.737677 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config\") pod 
\"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.737696 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.737994 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.738323 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.738545 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.738623 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: 
\"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.738862 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.739257 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.739445 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.763750 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqv2f\" (UniqueName: \"kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f\") pod \"dnsmasq-dns-68df85789f-rxgq6\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:31 crc kubenswrapper[4700]: I1007 11:41:31.815391 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:32 crc kubenswrapper[4700]: I1007 11:41:32.359523 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:32 crc kubenswrapper[4700]: W1007 11:41:32.364644 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4304150_ec17_4afa_87e9_e4a84b0eb346.slice/crio-454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6 WatchSource:0}: Error finding container 454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6: Status 404 returned error can't find the container with id 454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6 Oct 07 11:41:32 crc kubenswrapper[4700]: I1007 11:41:32.654527 4700 generic.go:334] "Generic (PLEG): container finished" podID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerID="a13a9a288261d076586d9b2623bfc75c55db7f5b74cc6e9d7728bf9295009722" exitCode=0 Oct 07 11:41:32 crc kubenswrapper[4700]: I1007 11:41:32.654699 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" event={"ID":"d4304150-ec17-4afa-87e9-e4a84b0eb346","Type":"ContainerDied","Data":"a13a9a288261d076586d9b2623bfc75c55db7f5b74cc6e9d7728bf9295009722"} Oct 07 11:41:32 crc kubenswrapper[4700]: I1007 11:41:32.655419 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" event={"ID":"d4304150-ec17-4afa-87e9-e4a84b0eb346","Type":"ContainerStarted","Data":"454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6"} Oct 07 11:41:33 crc kubenswrapper[4700]: I1007 11:41:33.666817 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" event={"ID":"d4304150-ec17-4afa-87e9-e4a84b0eb346","Type":"ContainerStarted","Data":"9ed83b1e7864fed35099bab1bc30f07bce8ccf05ae47ab108205a8bed0d03735"} Oct 07 11:41:33 crc 
kubenswrapper[4700]: I1007 11:41:33.667146 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:33 crc kubenswrapper[4700]: I1007 11:41:33.689751 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" podStartSLOduration=2.689718468 podStartE2EDuration="2.689718468s" podCreationTimestamp="2025-10-07 11:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:41:33.686440692 +0000 UTC m=+1260.482839711" watchObservedRunningTime="2025-10-07 11:41:33.689718468 +0000 UTC m=+1260.486117487" Oct 07 11:41:41 crc kubenswrapper[4700]: I1007 11:41:41.817573 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:41 crc kubenswrapper[4700]: I1007 11:41:41.896756 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:41:41 crc kubenswrapper[4700]: I1007 11:41:41.897782 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="dnsmasq-dns" containerID="cri-o://dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227" gracePeriod=10 Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.027792 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-42kbj"] Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.029316 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.058638 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-42kbj"] Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151266 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-svc\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151318 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151356 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151673 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k9ld\" (UniqueName: \"kubernetes.io/projected/33bd4ab3-f047-4932-a264-163e46ec9749-kube-api-access-6k9ld\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151750 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151776 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.151805 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-config\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255064 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255130 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255165 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-config\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255370 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-svc\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255391 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255414 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.255483 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k9ld\" (UniqueName: \"kubernetes.io/projected/33bd4ab3-f047-4932-a264-163e46ec9749-kube-api-access-6k9ld\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.256991 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.257554 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-svc\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.257865 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.258378 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-config\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.258523 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.259787 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33bd4ab3-f047-4932-a264-163e46ec9749-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.279634 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k9ld\" (UniqueName: \"kubernetes.io/projected/33bd4ab3-f047-4932-a264-163e46ec9749-kube-api-access-6k9ld\") pod \"dnsmasq-dns-bb85b8995-42kbj\" (UID: \"33bd4ab3-f047-4932-a264-163e46ec9749\") " pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.389239 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.399221 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.462063 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.462229 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.462289 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 
11:41:42.462447 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.462512 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.462561 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fp2j\" (UniqueName: \"kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.528166 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j" (OuterVolumeSpecName: "kube-api-access-4fp2j") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "kube-api-access-4fp2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.540938 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.564661 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.564790 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") pod \"306818d0-d575-4dae-be97-b92df625ff44\" (UID: \"306818d0-d575-4dae-be97-b92df625ff44\") " Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.565586 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.565605 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fp2j\" (UniqueName: \"kubernetes.io/projected/306818d0-d575-4dae-be97-b92df625ff44-kube-api-access-4fp2j\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: W1007 11:41:42.565690 4700 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/306818d0-d575-4dae-be97-b92df625ff44/volumes/kubernetes.io~configmap/dns-swift-storage-0 Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.565703 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: 
"306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.571836 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config" (OuterVolumeSpecName: "config") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.595806 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.627068 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "306818d0-d575-4dae-be97-b92df625ff44" (UID: "306818d0-d575-4dae-be97-b92df625ff44"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.668620 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.668645 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.668658 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.668666 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306818d0-d575-4dae-be97-b92df625ff44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.763696 4700 generic.go:334] "Generic (PLEG): container finished" podID="306818d0-d575-4dae-be97-b92df625ff44" containerID="dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227" exitCode=0 Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.764158 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" event={"ID":"306818d0-d575-4dae-be97-b92df625ff44","Type":"ContainerDied","Data":"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227"} Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.764185 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" event={"ID":"306818d0-d575-4dae-be97-b92df625ff44","Type":"ContainerDied","Data":"fbad093306cafb9237c8d7de696fc8032502cc9601004d42a0a083ec1aec272c"} Oct 07 
11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.764202 4700 scope.go:117] "RemoveContainer" containerID="dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.764356 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.795685 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.800044 4700 scope.go:117] "RemoveContainer" containerID="c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.802274 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-c4m94"] Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.825539 4700 scope.go:117] "RemoveContainer" containerID="dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227" Oct 07 11:41:42 crc kubenswrapper[4700]: E1007 11:41:42.826054 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227\": container with ID starting with dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227 not found: ID does not exist" containerID="dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.826094 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227"} err="failed to get container status \"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227\": rpc error: code = NotFound desc = could not find container \"dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227\": container with ID 
starting with dfa01667243cdbbfc801641a2c1f414d0b172e31288c2b5de7d233329fe1d227 not found: ID does not exist" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.826126 4700 scope.go:117] "RemoveContainer" containerID="c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10" Oct 07 11:41:42 crc kubenswrapper[4700]: E1007 11:41:42.826658 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10\": container with ID starting with c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10 not found: ID does not exist" containerID="c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.826686 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10"} err="failed to get container status \"c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10\": rpc error: code = NotFound desc = could not find container \"c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10\": container with ID starting with c6e3ea96907bbb4ab29a7f1ccd3beb0c467987f954aa1a5e1440ec0e12230d10 not found: ID does not exist" Oct 07 11:41:42 crc kubenswrapper[4700]: I1007 11:41:42.962957 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-42kbj"] Oct 07 11:41:42 crc kubenswrapper[4700]: W1007 11:41:42.969909 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33bd4ab3_f047_4932_a264_163e46ec9749.slice/crio-28ab8f9c52c83843003efac0ab4225f403e799000c07bc9cd8b558b2fb93fc58 WatchSource:0}: Error finding container 28ab8f9c52c83843003efac0ab4225f403e799000c07bc9cd8b558b2fb93fc58: Status 404 returned error can't find the container with id 
28ab8f9c52c83843003efac0ab4225f403e799000c07bc9cd8b558b2fb93fc58 Oct 07 11:41:43 crc kubenswrapper[4700]: I1007 11:41:43.779731 4700 generic.go:334] "Generic (PLEG): container finished" podID="33bd4ab3-f047-4932-a264-163e46ec9749" containerID="5a49324c6c044c957475020365f26c06068e9b680e38c50783b0d0b8e3f18a92" exitCode=0 Oct 07 11:41:43 crc kubenswrapper[4700]: I1007 11:41:43.779795 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" event={"ID":"33bd4ab3-f047-4932-a264-163e46ec9749","Type":"ContainerDied","Data":"5a49324c6c044c957475020365f26c06068e9b680e38c50783b0d0b8e3f18a92"} Oct 07 11:41:43 crc kubenswrapper[4700]: I1007 11:41:43.780459 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" event={"ID":"33bd4ab3-f047-4932-a264-163e46ec9749","Type":"ContainerStarted","Data":"28ab8f9c52c83843003efac0ab4225f403e799000c07bc9cd8b558b2fb93fc58"} Oct 07 11:41:43 crc kubenswrapper[4700]: I1007 11:41:43.975003 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306818d0-d575-4dae-be97-b92df625ff44" path="/var/lib/kubelet/pods/306818d0-d575-4dae-be97-b92df625ff44/volumes" Oct 07 11:41:44 crc kubenswrapper[4700]: I1007 11:41:44.795758 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" event={"ID":"33bd4ab3-f047-4932-a264-163e46ec9749","Type":"ContainerStarted","Data":"09f4358ecd7331b20913e7358ab09d96ec013832833c3002719d3dac55fdb994"} Oct 07 11:41:44 crc kubenswrapper[4700]: I1007 11:41:44.796623 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:44 crc kubenswrapper[4700]: I1007 11:41:44.824254 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" podStartSLOduration=3.824233929 podStartE2EDuration="3.824233929s" podCreationTimestamp="2025-10-07 11:41:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:41:44.823775597 +0000 UTC m=+1271.620174576" watchObservedRunningTime="2025-10-07 11:41:44.824233929 +0000 UTC m=+1271.620632918" Oct 07 11:41:45 crc kubenswrapper[4700]: I1007 11:41:45.334227 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:41:45 crc kubenswrapper[4700]: I1007 11:41:45.334293 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:41:47 crc kubenswrapper[4700]: I1007 11:41:47.309101 4700 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-79b5d74c8c-c4m94" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.209:5353: i/o timeout" Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.391425 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb85b8995-42kbj" Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.499635 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.500354 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="dnsmasq-dns" 
containerID="cri-o://9ed83b1e7864fed35099bab1bc30f07bce8ccf05ae47ab108205a8bed0d03735" gracePeriod=10 Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.875062 4700 generic.go:334] "Generic (PLEG): container finished" podID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerID="9ed83b1e7864fed35099bab1bc30f07bce8ccf05ae47ab108205a8bed0d03735" exitCode=0 Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.875338 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" event={"ID":"d4304150-ec17-4afa-87e9-e4a84b0eb346","Type":"ContainerDied","Data":"9ed83b1e7864fed35099bab1bc30f07bce8ccf05ae47ab108205a8bed0d03735"} Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.875385 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" event={"ID":"d4304150-ec17-4afa-87e9-e4a84b0eb346","Type":"ContainerDied","Data":"454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6"} Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.875397 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454fc46904e5c234ed457a2e988fa013b2636ffca9e91634ddfb906e7dad66e6" Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.945186 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985560 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985658 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985778 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqv2f\" (UniqueName: \"kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985824 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985860 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.985948 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:52 crc kubenswrapper[4700]: I1007 11:41:52.986028 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam\") pod \"d4304150-ec17-4afa-87e9-e4a84b0eb346\" (UID: \"d4304150-ec17-4afa-87e9-e4a84b0eb346\") " Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.018524 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f" (OuterVolumeSpecName: "kube-api-access-jqv2f") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "kube-api-access-jqv2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.093854 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqv2f\" (UniqueName: \"kubernetes.io/projected/d4304150-ec17-4afa-87e9-e4a84b0eb346-kube-api-access-jqv2f\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.094741 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.170560 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.171667 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.177852 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.182424 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.195186 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.195213 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.195223 4700 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.195230 4700 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.195239 4700 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.214449 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config" (OuterVolumeSpecName: "config") pod "d4304150-ec17-4afa-87e9-e4a84b0eb346" (UID: "d4304150-ec17-4afa-87e9-e4a84b0eb346"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.296633 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4304150-ec17-4afa-87e9-e4a84b0eb346-config\") on node \"crc\" DevicePath \"\"" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.883766 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-rxgq6" Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.921496 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.930802 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-rxgq6"] Oct 07 11:41:53 crc kubenswrapper[4700]: I1007 11:41:53.976675 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" path="/var/lib/kubelet/pods/d4304150-ec17-4afa-87e9-e4a84b0eb346/volumes" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.343052 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5"] Oct 07 11:42:01 crc kubenswrapper[4700]: E1007 11:42:01.344173 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="init" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344191 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="init" Oct 07 11:42:01 crc kubenswrapper[4700]: E1007 11:42:01.344233 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344242 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="306818d0-d575-4dae-be97-b92df625ff44" 
containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: E1007 11:42:01.344273 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="init" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344281 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="init" Oct 07 11:42:01 crc kubenswrapper[4700]: E1007 11:42:01.344291 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344298 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344557 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="306818d0-d575-4dae-be97-b92df625ff44" containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.344576 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4304150-ec17-4afa-87e9-e4a84b0eb346" containerName="dnsmasq-dns" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.345380 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.348007 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.348097 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.348273 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.348336 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.355701 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5"] Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.491412 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.491518 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.491644 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.491686 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc48d\" (UniqueName: \"kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.593781 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.593874 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.593950 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.593992 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc48d\" (UniqueName: \"kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.606380 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.606550 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.608638 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.622138 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc48d\" (UniqueName: \"kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:01 crc kubenswrapper[4700]: I1007 11:42:01.668172 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:02 crc kubenswrapper[4700]: I1007 11:42:02.232786 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5"] Oct 07 11:42:02 crc kubenswrapper[4700]: I1007 11:42:02.242231 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 11:42:02 crc kubenswrapper[4700]: I1007 11:42:02.972471 4700 generic.go:334] "Generic (PLEG): container finished" podID="509a6d73-2ff1-43f5-aa66-97d3a7d10e88" containerID="6ccdefc80b430cbdc66320262c4256099ee2e5f2853dee495f9ff4aca2a7c7ca" exitCode=0 Oct 07 11:42:02 crc kubenswrapper[4700]: I1007 11:42:02.972542 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"509a6d73-2ff1-43f5-aa66-97d3a7d10e88","Type":"ContainerDied","Data":"6ccdefc80b430cbdc66320262c4256099ee2e5f2853dee495f9ff4aca2a7c7ca"} Oct 07 11:42:02 crc kubenswrapper[4700]: I1007 11:42:02.974126 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" event={"ID":"3da1015a-c431-4f0f-971a-98b31f112e53","Type":"ContainerStarted","Data":"03025e430d84dfbccc32b5d88f0bc295ddcd5eddf66474725ee7ecdb9f968deb"} Oct 07 11:42:03 crc kubenswrapper[4700]: I1007 11:42:03.991296 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"509a6d73-2ff1-43f5-aa66-97d3a7d10e88","Type":"ContainerStarted","Data":"457a8fbe25ea659f6f92582f5c4f50572fdba199691b3e389b50d949181827d9"} Oct 07 11:42:03 crc kubenswrapper[4700]: I1007 11:42:03.992853 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 11:42:04 crc kubenswrapper[4700]: I1007 11:42:04.032593 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.032575678 podStartE2EDuration="37.032575678s" podCreationTimestamp="2025-10-07 11:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:42:04.02920815 +0000 UTC m=+1290.825607159" watchObservedRunningTime="2025-10-07 11:42:04.032575678 +0000 UTC m=+1290.828974667" Oct 07 11:42:05 crc kubenswrapper[4700]: I1007 11:42:05.002656 4700 generic.go:334] "Generic (PLEG): container finished" podID="abea1f83-cad5-40e9-a9d7-543660436ae0" containerID="30c32379454a9af2e70ad016ca486180a302228ab6bdce17e0d971b74c66b2c2" exitCode=0 Oct 07 11:42:05 crc kubenswrapper[4700]: I1007 11:42:05.002740 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"abea1f83-cad5-40e9-a9d7-543660436ae0","Type":"ContainerDied","Data":"30c32379454a9af2e70ad016ca486180a302228ab6bdce17e0d971b74c66b2c2"} Oct 07 11:42:06 crc kubenswrapper[4700]: I1007 11:42:06.013349 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"abea1f83-cad5-40e9-a9d7-543660436ae0","Type":"ContainerStarted","Data":"caefe2aeb8f65d4ea822e2889fcc0b0c46acbe8ddfeb7f1bf1d36d015d1ef116"} Oct 07 11:42:06 crc kubenswrapper[4700]: I1007 11:42:06.013888 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:42:06 crc kubenswrapper[4700]: I1007 11:42:06.036371 4700 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.036351109 podStartE2EDuration="38.036351109s" podCreationTimestamp="2025-10-07 11:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 11:42:06.031993096 +0000 UTC m=+1292.828392105" watchObservedRunningTime="2025-10-07 11:42:06.036351109 +0000 UTC m=+1292.832750098" Oct 07 11:42:11 crc kubenswrapper[4700]: I1007 11:42:11.090600 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" event={"ID":"3da1015a-c431-4f0f-971a-98b31f112e53","Type":"ContainerStarted","Data":"b3e766323171b9870c25df560963d8838c349a43db5da2e6de8fe2bfb6a28af5"} Oct 07 11:42:11 crc kubenswrapper[4700]: I1007 11:42:11.114123 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" podStartSLOduration=1.625568645 podStartE2EDuration="10.114102379s" podCreationTimestamp="2025-10-07 11:42:01 +0000 UTC" firstStartedPulling="2025-10-07 11:42:02.241969243 +0000 UTC m=+1289.038368242" lastFinishedPulling="2025-10-07 11:42:10.730502977 +0000 UTC m=+1297.526901976" observedRunningTime="2025-10-07 11:42:11.106504752 +0000 UTC m=+1297.902903751" watchObservedRunningTime="2025-10-07 11:42:11.114102379 +0000 UTC m=+1297.910501368" Oct 07 11:42:15 crc kubenswrapper[4700]: I1007 11:42:15.334199 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:42:15 crc kubenswrapper[4700]: I1007 11:42:15.334526 4700 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:42:18 crc kubenswrapper[4700]: I1007 11:42:18.031863 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 11:42:19 crc kubenswrapper[4700]: I1007 11:42:19.128558 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 11:42:23 crc kubenswrapper[4700]: I1007 11:42:23.218185 4700 generic.go:334] "Generic (PLEG): container finished" podID="3da1015a-c431-4f0f-971a-98b31f112e53" containerID="b3e766323171b9870c25df560963d8838c349a43db5da2e6de8fe2bfb6a28af5" exitCode=0 Oct 07 11:42:23 crc kubenswrapper[4700]: I1007 11:42:23.218264 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" event={"ID":"3da1015a-c431-4f0f-971a-98b31f112e53","Type":"ContainerDied","Data":"b3e766323171b9870c25df560963d8838c349a43db5da2e6de8fe2bfb6a28af5"} Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.700541 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.859396 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle\") pod \"3da1015a-c431-4f0f-971a-98b31f112e53\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.859528 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory\") pod \"3da1015a-c431-4f0f-971a-98b31f112e53\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.859576 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key\") pod \"3da1015a-c431-4f0f-971a-98b31f112e53\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.859646 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc48d\" (UniqueName: \"kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d\") pod \"3da1015a-c431-4f0f-971a-98b31f112e53\" (UID: \"3da1015a-c431-4f0f-971a-98b31f112e53\") " Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.865301 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d" (OuterVolumeSpecName: "kube-api-access-wc48d") pod "3da1015a-c431-4f0f-971a-98b31f112e53" (UID: "3da1015a-c431-4f0f-971a-98b31f112e53"). InnerVolumeSpecName "kube-api-access-wc48d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.865969 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3da1015a-c431-4f0f-971a-98b31f112e53" (UID: "3da1015a-c431-4f0f-971a-98b31f112e53"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.890853 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3da1015a-c431-4f0f-971a-98b31f112e53" (UID: "3da1015a-c431-4f0f-971a-98b31f112e53"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.892656 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory" (OuterVolumeSpecName: "inventory") pod "3da1015a-c431-4f0f-971a-98b31f112e53" (UID: "3da1015a-c431-4f0f-971a-98b31f112e53"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.961704 4700 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.961744 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.961754 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da1015a-c431-4f0f-971a-98b31f112e53-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:24 crc kubenswrapper[4700]: I1007 11:42:24.961763 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc48d\" (UniqueName: \"kubernetes.io/projected/3da1015a-c431-4f0f-971a-98b31f112e53-kube-api-access-wc48d\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.245909 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" event={"ID":"3da1015a-c431-4f0f-971a-98b31f112e53","Type":"ContainerDied","Data":"03025e430d84dfbccc32b5d88f0bc295ddcd5eddf66474725ee7ecdb9f968deb"} Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.246022 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03025e430d84dfbccc32b5d88f0bc295ddcd5eddf66474725ee7ecdb9f968deb" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.246034 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.347723 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j"] Oct 07 11:42:25 crc kubenswrapper[4700]: E1007 11:42:25.348152 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da1015a-c431-4f0f-971a-98b31f112e53" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.348173 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da1015a-c431-4f0f-971a-98b31f112e53" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.348476 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da1015a-c431-4f0f-971a-98b31f112e53" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.349485 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.360591 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.360932 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.361489 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j"] Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.362904 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.367293 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.474484 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.474597 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.474676 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjfzv\" (UniqueName: \"kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.576256 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.576365 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.576417 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjfzv\" (UniqueName: \"kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.582238 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.582693 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.591913 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjfzv\" (UniqueName: \"kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7926j\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:25 crc kubenswrapper[4700]: I1007 11:42:25.710132 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:26 crc kubenswrapper[4700]: I1007 11:42:26.239356 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j"] Oct 07 11:42:26 crc kubenswrapper[4700]: W1007 11:42:26.243676 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1dbb92e_36fc_47a7_8c0f_2b28c5f2d636.slice/crio-ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47 WatchSource:0}: Error finding container ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47: Status 404 returned error can't find the container with id ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47 Oct 07 11:42:26 crc kubenswrapper[4700]: I1007 11:42:26.258966 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" event={"ID":"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636","Type":"ContainerStarted","Data":"ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47"} Oct 07 11:42:27 crc kubenswrapper[4700]: I1007 11:42:27.272019 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" event={"ID":"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636","Type":"ContainerStarted","Data":"2f323fc4cf88624a1c914c1cea40062ca9d523e87c4d011fc74b4f802fec0a09"} Oct 07 11:42:27 crc kubenswrapper[4700]: I1007 11:42:27.295812 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" podStartSLOduration=1.770664992 podStartE2EDuration="2.295786154s" podCreationTimestamp="2025-10-07 11:42:25 +0000 UTC" firstStartedPulling="2025-10-07 11:42:26.248703493 +0000 UTC m=+1313.045102482" lastFinishedPulling="2025-10-07 11:42:26.773824615 +0000 UTC m=+1313.570223644" observedRunningTime="2025-10-07 
11:42:27.288559476 +0000 UTC m=+1314.084958465" watchObservedRunningTime="2025-10-07 11:42:27.295786154 +0000 UTC m=+1314.092185173" Oct 07 11:42:30 crc kubenswrapper[4700]: I1007 11:42:30.305891 4700 generic.go:334] "Generic (PLEG): container finished" podID="d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" containerID="2f323fc4cf88624a1c914c1cea40062ca9d523e87c4d011fc74b4f802fec0a09" exitCode=0 Oct 07 11:42:30 crc kubenswrapper[4700]: I1007 11:42:30.305983 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" event={"ID":"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636","Type":"ContainerDied","Data":"2f323fc4cf88624a1c914c1cea40062ca9d523e87c4d011fc74b4f802fec0a09"} Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.754575 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.906502 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjfzv\" (UniqueName: \"kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv\") pod \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.906633 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory\") pod \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.906907 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key\") pod \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\" (UID: \"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636\") " Oct 07 11:42:31 crc 
kubenswrapper[4700]: I1007 11:42:31.913541 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv" (OuterVolumeSpecName: "kube-api-access-cjfzv") pod "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" (UID: "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636"). InnerVolumeSpecName "kube-api-access-cjfzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.945020 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory" (OuterVolumeSpecName: "inventory") pod "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" (UID: "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:42:31 crc kubenswrapper[4700]: I1007 11:42:31.967512 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" (UID: "d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.008556 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.008591 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.008601 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjfzv\" (UniqueName: \"kubernetes.io/projected/d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636-kube-api-access-cjfzv\") on node \"crc\" DevicePath \"\"" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.341258 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" event={"ID":"d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636","Type":"ContainerDied","Data":"ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47"} Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.341294 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7926j" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.341365 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae8466b518e045fef076f2922758a54cd1e2f04c03c712785eb4b08f30b76b47" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.405899 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx"] Oct 07 11:42:32 crc kubenswrapper[4700]: E1007 11:42:32.406376 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.406398 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.406591 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.407200 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.413382 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.413490 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.413661 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.413853 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.414341 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx"] Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.520943 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.520984 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv7tl\" (UniqueName: \"kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.521069 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.521113 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.622904 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.622964 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.623087 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.623108 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv7tl\" (UniqueName: \"kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.627902 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.628057 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.634373 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.637176 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wv7tl\" (UniqueName: \"kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:32 crc kubenswrapper[4700]: I1007 11:42:32.731450 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:42:33 crc kubenswrapper[4700]: I1007 11:42:33.341167 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx"] Oct 07 11:42:34 crc kubenswrapper[4700]: I1007 11:42:34.358887 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" event={"ID":"112df1a0-e767-41be-a95e-4f7e62024fa2","Type":"ContainerStarted","Data":"55d48e8123e7a01882152b88c0ed844cd98b53a93cc9ffea7c95b0fc0ba3f448"} Oct 07 11:42:34 crc kubenswrapper[4700]: I1007 11:42:34.359265 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" event={"ID":"112df1a0-e767-41be-a95e-4f7e62024fa2","Type":"ContainerStarted","Data":"8c8e599a4b174e0e91abfd16f8a53691308ac12983faf7e488d26d7a0cd19432"} Oct 07 11:42:34 crc kubenswrapper[4700]: I1007 11:42:34.374854 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" podStartSLOduration=1.924163431 podStartE2EDuration="2.374837058s" podCreationTimestamp="2025-10-07 11:42:32 +0000 UTC" firstStartedPulling="2025-10-07 11:42:33.36095831 +0000 UTC m=+1320.157357299" lastFinishedPulling="2025-10-07 11:42:33.811631937 +0000 UTC m=+1320.608030926" observedRunningTime="2025-10-07 11:42:34.373361649 +0000 UTC m=+1321.169760648" 
watchObservedRunningTime="2025-10-07 11:42:34.374837058 +0000 UTC m=+1321.171236047" Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.333898 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.336593 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.336686 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.337543 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.337616 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363" gracePeriod=600 Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.492276 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363" exitCode=0 Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.492336 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363"} Oct 07 11:42:45 crc kubenswrapper[4700]: I1007 11:42:45.492392 4700 scope.go:117] "RemoveContainer" containerID="39f51c218de12efa082d2cc5034a6195a011e23573e568730496b6798d2fbe71" Oct 07 11:42:46 crc kubenswrapper[4700]: I1007 11:42:46.509888 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4"} Oct 07 11:43:03 crc kubenswrapper[4700]: I1007 11:43:03.510598 4700 scope.go:117] "RemoveContainer" containerID="ba4093bbaa3c2de7ca0aa39ce7b8afb747128458eb7ba59f3f285d9ca6373e48" Oct 07 11:43:03 crc kubenswrapper[4700]: I1007 11:43:03.573752 4700 scope.go:117] "RemoveContainer" containerID="6b88e3b321d163b31884cb8c8ad7a51e70893093ff03f97f6832b45a6179a6e2" Oct 07 11:43:03 crc kubenswrapper[4700]: I1007 11:43:03.623086 4700 scope.go:117] "RemoveContainer" containerID="47af6b6ff9ef5fd7f618a5997659b15d4a8e586766205c3395d4e2052a2bb5e8" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.624950 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.632519 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.639061 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.706876 4700 scope.go:117] "RemoveContainer" containerID="8ba875b9970322122937419ca40d4a433e564a2f21d72bb77973f8eac13b7334" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.722995 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29fc\" (UniqueName: \"kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.723075 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.723188 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.736361 4700 scope.go:117] "RemoveContainer" containerID="18949fced552b3138e4f36a465881e5a00520a9f77fb3f38d51c5aae57604e1c" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.792573 4700 scope.go:117] "RemoveContainer" 
containerID="1f0493224277713d33d3c5331e8bb336b71e555a8e3b1707531c13cd3a7e06ba" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.818232 4700 scope.go:117] "RemoveContainer" containerID="baa5af43a455336c1da8d2c15456158bfd440fe9e926e442d93be4e6efa18560" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.825062 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.825418 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s29fc\" (UniqueName: \"kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.825552 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.825735 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.825965 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.850765 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s29fc\" (UniqueName: \"kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc\") pod \"certified-operators-q99vw\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:03 crc kubenswrapper[4700]: I1007 11:44:03.961858 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:04 crc kubenswrapper[4700]: I1007 11:44:04.456811 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:05 crc kubenswrapper[4700]: I1007 11:44:05.456634 4700 generic.go:334] "Generic (PLEG): container finished" podID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerID="158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20" exitCode=0 Oct 07 11:44:05 crc kubenswrapper[4700]: I1007 11:44:05.456702 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerDied","Data":"158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20"} Oct 07 11:44:05 crc kubenswrapper[4700]: I1007 11:44:05.457094 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerStarted","Data":"89dbb554969919b671e54f02fe8381eef92a63ddf8efa522806efb6b85451bf8"} Oct 07 11:44:07 crc kubenswrapper[4700]: I1007 11:44:07.481413 4700 generic.go:334] "Generic (PLEG): 
container finished" podID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerID="f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed" exitCode=0 Oct 07 11:44:07 crc kubenswrapper[4700]: I1007 11:44:07.481494 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerDied","Data":"f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed"} Oct 07 11:44:08 crc kubenswrapper[4700]: I1007 11:44:08.500487 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerStarted","Data":"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6"} Oct 07 11:44:13 crc kubenswrapper[4700]: I1007 11:44:13.983853 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:13 crc kubenswrapper[4700]: I1007 11:44:13.984459 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:14 crc kubenswrapper[4700]: I1007 11:44:14.046232 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:14 crc kubenswrapper[4700]: I1007 11:44:14.077008 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q99vw" podStartSLOduration=8.632967683 podStartE2EDuration="11.07698715s" podCreationTimestamp="2025-10-07 11:44:03 +0000 UTC" firstStartedPulling="2025-10-07 11:44:05.460821284 +0000 UTC m=+1412.257220303" lastFinishedPulling="2025-10-07 11:44:07.904840751 +0000 UTC m=+1414.701239770" observedRunningTime="2025-10-07 11:44:08.534973583 +0000 UTC m=+1415.331372622" watchObservedRunningTime="2025-10-07 11:44:14.07698715 +0000 UTC 
m=+1420.873386139" Oct 07 11:44:14 crc kubenswrapper[4700]: I1007 11:44:14.638738 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:14 crc kubenswrapper[4700]: I1007 11:44:14.722364 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:16 crc kubenswrapper[4700]: I1007 11:44:16.594281 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q99vw" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="registry-server" containerID="cri-o://a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6" gracePeriod=2 Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.136468 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.201383 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities\") pod \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.201547 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s29fc\" (UniqueName: \"kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc\") pod \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\" (UID: \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.201806 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content\") pod \"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\" (UID: 
\"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514\") " Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.202171 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities" (OuterVolumeSpecName: "utilities") pod "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" (UID: "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.202963 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.207244 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc" (OuterVolumeSpecName: "kube-api-access-s29fc") pod "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" (UID: "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514"). InnerVolumeSpecName "kube-api-access-s29fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.251102 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" (UID: "8f3ae1dd-ead7-4f8f-a6b9-95da701ae514"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.304276 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s29fc\" (UniqueName: \"kubernetes.io/projected/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-kube-api-access-s29fc\") on node \"crc\" DevicePath \"\"" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.304326 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.610173 4700 generic.go:334] "Generic (PLEG): container finished" podID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerID="a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6" exitCode=0 Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.610249 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerDied","Data":"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6"} Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.610281 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q99vw" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.610351 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q99vw" event={"ID":"8f3ae1dd-ead7-4f8f-a6b9-95da701ae514","Type":"ContainerDied","Data":"89dbb554969919b671e54f02fe8381eef92a63ddf8efa522806efb6b85451bf8"} Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.610405 4700 scope.go:117] "RemoveContainer" containerID="a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.641176 4700 scope.go:117] "RemoveContainer" containerID="f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.658367 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.666846 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q99vw"] Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.690830 4700 scope.go:117] "RemoveContainer" containerID="158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.768111 4700 scope.go:117] "RemoveContainer" containerID="a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6" Oct 07 11:44:17 crc kubenswrapper[4700]: E1007 11:44:17.771556 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6\": container with ID starting with a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6 not found: ID does not exist" containerID="a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.771609 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6"} err="failed to get container status \"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6\": rpc error: code = NotFound desc = could not find container \"a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6\": container with ID starting with a9bc13bd2d230bce21450dd8b320a326059bd7ef8207e98f9a6b127bac867ab6 not found: ID does not exist" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.771642 4700 scope.go:117] "RemoveContainer" containerID="f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed" Oct 07 11:44:17 crc kubenswrapper[4700]: E1007 11:44:17.772160 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed\": container with ID starting with f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed not found: ID does not exist" containerID="f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.772216 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed"} err="failed to get container status \"f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed\": rpc error: code = NotFound desc = could not find container \"f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed\": container with ID starting with f9c19c838692c071f9769f05809449a0b64f54d28c732a66a6e3b410923dc0ed not found: ID does not exist" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.772255 4700 scope.go:117] "RemoveContainer" containerID="158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20" Oct 07 11:44:17 crc kubenswrapper[4700]: E1007 
11:44:17.772748 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20\": container with ID starting with 158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20 not found: ID does not exist" containerID="158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.772774 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20"} err="failed to get container status \"158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20\": rpc error: code = NotFound desc = could not find container \"158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20\": container with ID starting with 158fbfc3c1371f5616cf39bb7cba52c29998367c8ccbcdaf6057e32c5988cf20 not found: ID does not exist" Oct 07 11:44:17 crc kubenswrapper[4700]: I1007 11:44:17.971291 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" path="/var/lib/kubelet/pods/8f3ae1dd-ead7-4f8f-a6b9-95da701ae514/volumes" Oct 07 11:44:45 crc kubenswrapper[4700]: I1007 11:44:45.333795 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:44:45 crc kubenswrapper[4700]: I1007 11:44:45.334384 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.150628 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq"] Oct 07 11:45:00 crc kubenswrapper[4700]: E1007 11:45:00.151672 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="extract-content" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.151689 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="extract-content" Oct 07 11:45:00 crc kubenswrapper[4700]: E1007 11:45:00.151712 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="extract-utilities" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.151719 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="extract-utilities" Oct 07 11:45:00 crc kubenswrapper[4700]: E1007 11:45:00.151750 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="registry-server" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.151756 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="registry-server" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.151963 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3ae1dd-ead7-4f8f-a6b9-95da701ae514" containerName="registry-server" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.153517 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.155714 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.157009 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.163421 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq"] Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.315787 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.316017 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.316106 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6v98\" (UniqueName: \"kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.417967 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.418023 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.418046 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6v98\" (UniqueName: \"kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.419143 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.427360 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.447394 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6v98\" (UniqueName: \"kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98\") pod \"collect-profiles-29330625-frtmq\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:00 crc kubenswrapper[4700]: I1007 11:45:00.473111 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:01 crc kubenswrapper[4700]: I1007 11:45:01.016101 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq"] Oct 07 11:45:01 crc kubenswrapper[4700]: I1007 11:45:01.106878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" event={"ID":"8998a50a-5077-4ee0-aa24-54ee046989b3","Type":"ContainerStarted","Data":"7d00cbc3d73312c5a7eeea56c1829cf073b46957263684c83685febae2848de5"} Oct 07 11:45:02 crc kubenswrapper[4700]: I1007 11:45:02.122236 4700 generic.go:334] "Generic (PLEG): container finished" podID="8998a50a-5077-4ee0-aa24-54ee046989b3" containerID="3ba6614b3cd4d1e380f606f9ac8ac08a8fe47511de2dbd7f544641ed0a36fc2f" exitCode=0 Oct 07 11:45:02 crc kubenswrapper[4700]: I1007 11:45:02.123384 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" 
event={"ID":"8998a50a-5077-4ee0-aa24-54ee046989b3","Type":"ContainerDied","Data":"3ba6614b3cd4d1e380f606f9ac8ac08a8fe47511de2dbd7f544641ed0a36fc2f"} Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.452397 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.494037 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume\") pod \"8998a50a-5077-4ee0-aa24-54ee046989b3\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.494169 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume\") pod \"8998a50a-5077-4ee0-aa24-54ee046989b3\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.494316 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6v98\" (UniqueName: \"kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98\") pod \"8998a50a-5077-4ee0-aa24-54ee046989b3\" (UID: \"8998a50a-5077-4ee0-aa24-54ee046989b3\") " Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.495644 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "8998a50a-5077-4ee0-aa24-54ee046989b3" (UID: "8998a50a-5077-4ee0-aa24-54ee046989b3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.500455 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8998a50a-5077-4ee0-aa24-54ee046989b3" (UID: "8998a50a-5077-4ee0-aa24-54ee046989b3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.501871 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98" (OuterVolumeSpecName: "kube-api-access-r6v98") pod "8998a50a-5077-4ee0-aa24-54ee046989b3" (UID: "8998a50a-5077-4ee0-aa24-54ee046989b3"). InnerVolumeSpecName "kube-api-access-r6v98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.597189 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6v98\" (UniqueName: \"kubernetes.io/projected/8998a50a-5077-4ee0-aa24-54ee046989b3-kube-api-access-r6v98\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.597234 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8998a50a-5077-4ee0-aa24-54ee046989b3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.597250 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8998a50a-5077-4ee0-aa24-54ee046989b3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:03 crc kubenswrapper[4700]: I1007 11:45:03.924516 4700 scope.go:117] "RemoveContainer" containerID="370391fb89d0b6dbfe5285a98a41f53e1e1f4d782e3b568d7f8fc6e55edf5a19" Oct 07 11:45:04 crc kubenswrapper[4700]: I1007 
11:45:04.141450 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" event={"ID":"8998a50a-5077-4ee0-aa24-54ee046989b3","Type":"ContainerDied","Data":"7d00cbc3d73312c5a7eeea56c1829cf073b46957263684c83685febae2848de5"} Oct 07 11:45:04 crc kubenswrapper[4700]: I1007 11:45:04.141482 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq" Oct 07 11:45:04 crc kubenswrapper[4700]: I1007 11:45:04.141491 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d00cbc3d73312c5a7eeea56c1829cf073b46957263684c83685febae2848de5" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.654204 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:13 crc kubenswrapper[4700]: E1007 11:45:13.655523 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8998a50a-5077-4ee0-aa24-54ee046989b3" containerName="collect-profiles" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.655546 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="8998a50a-5077-4ee0-aa24-54ee046989b3" containerName="collect-profiles" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.655912 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="8998a50a-5077-4ee0-aa24-54ee046989b3" containerName="collect-profiles" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.658376 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.671930 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.721802 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.721948 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.722017 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28f5p\" (UniqueName: \"kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.823844 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28f5p\" (UniqueName: \"kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.824289 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.824680 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.825063 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.825109 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.849078 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28f5p\" (UniqueName: \"kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p\") pod \"redhat-marketplace-2f2qn\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:13 crc kubenswrapper[4700]: I1007 11:45:13.985344 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:14 crc kubenswrapper[4700]: I1007 11:45:14.467420 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:14 crc kubenswrapper[4700]: W1007 11:45:14.487126 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f7ffb6_7be6_46ab_b4c7_20244b5d813e.slice/crio-2f1143db3749f26ad57c9d25bd89cf6e06f970f4f7277e9521baaf5458d4e980 WatchSource:0}: Error finding container 2f1143db3749f26ad57c9d25bd89cf6e06f970f4f7277e9521baaf5458d4e980: Status 404 returned error can't find the container with id 2f1143db3749f26ad57c9d25bd89cf6e06f970f4f7277e9521baaf5458d4e980 Oct 07 11:45:15 crc kubenswrapper[4700]: I1007 11:45:15.252027 4700 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerID="563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8" exitCode=0 Oct 07 11:45:15 crc kubenswrapper[4700]: I1007 11:45:15.252097 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerDied","Data":"563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8"} Oct 07 11:45:15 crc kubenswrapper[4700]: I1007 11:45:15.252374 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerStarted","Data":"2f1143db3749f26ad57c9d25bd89cf6e06f970f4f7277e9521baaf5458d4e980"} Oct 07 11:45:15 crc kubenswrapper[4700]: I1007 11:45:15.334133 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 07 11:45:15 crc kubenswrapper[4700]: I1007 11:45:15.334190 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:45:16 crc kubenswrapper[4700]: I1007 11:45:16.262347 4700 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerID="c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6" exitCode=0 Oct 07 11:45:16 crc kubenswrapper[4700]: I1007 11:45:16.262511 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerDied","Data":"c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6"} Oct 07 11:45:17 crc kubenswrapper[4700]: I1007 11:45:17.277951 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerStarted","Data":"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef"} Oct 07 11:45:17 crc kubenswrapper[4700]: I1007 11:45:17.311474 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2f2qn" podStartSLOduration=2.785322312 podStartE2EDuration="4.311453086s" podCreationTimestamp="2025-10-07 11:45:13 +0000 UTC" firstStartedPulling="2025-10-07 11:45:15.254561315 +0000 UTC m=+1482.050960314" lastFinishedPulling="2025-10-07 11:45:16.780692089 +0000 UTC m=+1483.577091088" observedRunningTime="2025-10-07 11:45:17.30546361 +0000 UTC m=+1484.101862609" watchObservedRunningTime="2025-10-07 11:45:17.311453086 +0000 UTC m=+1484.107852085" Oct 07 11:45:23 crc 
kubenswrapper[4700]: I1007 11:45:23.986813 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:23 crc kubenswrapper[4700]: I1007 11:45:23.987448 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.055464 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.400153 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.443793 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.446510 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.487227 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.632820 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.632888 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xnb\" (UniqueName: \"kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.632988 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.734446 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.734495 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s6xnb\" (UniqueName: \"kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.734553 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.735041 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.735380 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.755630 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xnb\" (UniqueName: \"kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb\") pod \"redhat-operators-h5tvz\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:24 crc kubenswrapper[4700]: I1007 11:45:24.787122 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:25 crc kubenswrapper[4700]: I1007 11:45:25.262801 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:25 crc kubenswrapper[4700]: I1007 11:45:25.357104 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerStarted","Data":"073ea3b952cba4264439ca8066313002c1a082382dda3cb45f3ae27f915ee925"} Oct 07 11:45:26 crc kubenswrapper[4700]: I1007 11:45:26.370156 4700 generic.go:334] "Generic (PLEG): container finished" podID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerID="50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2" exitCode=0 Oct 07 11:45:26 crc kubenswrapper[4700]: I1007 11:45:26.370211 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerDied","Data":"50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2"} Oct 07 11:45:26 crc kubenswrapper[4700]: I1007 11:45:26.705242 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:26 crc kubenswrapper[4700]: I1007 11:45:26.705844 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2f2qn" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="registry-server" containerID="cri-o://43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef" gracePeriod=2 Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.339969 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.383486 4700 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerID="43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef" exitCode=0 Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.383535 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerDied","Data":"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef"} Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.383591 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f2qn" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.383605 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f2qn" event={"ID":"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e","Type":"ContainerDied","Data":"2f1143db3749f26ad57c9d25bd89cf6e06f970f4f7277e9521baaf5458d4e980"} Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.383638 4700 scope.go:117] "RemoveContainer" containerID="43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.412907 4700 scope.go:117] "RemoveContainer" containerID="c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.444716 4700 scope.go:117] "RemoveContainer" containerID="563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.490103 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities\") pod 
\"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.490858 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content\") pod \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.490941 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28f5p\" (UniqueName: \"kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p\") pod \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\" (UID: \"f2f7ffb6-7be6-46ab-b4c7-20244b5d813e\") " Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.491658 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities" (OuterVolumeSpecName: "utilities") pod "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" (UID: "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.500675 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p" (OuterVolumeSpecName: "kube-api-access-28f5p") pod "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" (UID: "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e"). InnerVolumeSpecName "kube-api-access-28f5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.502906 4700 scope.go:117] "RemoveContainer" containerID="43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef" Oct 07 11:45:27 crc kubenswrapper[4700]: E1007 11:45:27.503741 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef\": container with ID starting with 43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef not found: ID does not exist" containerID="43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.503773 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef"} err="failed to get container status \"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef\": rpc error: code = NotFound desc = could not find container \"43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef\": container with ID starting with 43db98f288b59493fcad51f29903f94e6d0b375fa8081ca36a809d38641c37ef not found: ID does not exist" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.503794 4700 scope.go:117] "RemoveContainer" containerID="c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6" Oct 07 11:45:27 crc kubenswrapper[4700]: E1007 11:45:27.504083 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6\": container with ID starting with c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6 not found: ID does not exist" containerID="c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.504110 
4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6"} err="failed to get container status \"c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6\": rpc error: code = NotFound desc = could not find container \"c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6\": container with ID starting with c763f70a246059504564daa13a8abdfec120a36dfa4150648cf9a38e2ae84bd6 not found: ID does not exist" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.504127 4700 scope.go:117] "RemoveContainer" containerID="563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8" Oct 07 11:45:27 crc kubenswrapper[4700]: E1007 11:45:27.504472 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8\": container with ID starting with 563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8 not found: ID does not exist" containerID="563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.504490 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8"} err="failed to get container status \"563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8\": rpc error: code = NotFound desc = could not find container \"563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8\": container with ID starting with 563823e869352a66405376e9776f99f8c7c8285c88830c7f09447bd6a0e115c8 not found: ID does not exist" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.504769 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" (UID: "f2f7ffb6-7be6-46ab-b4c7-20244b5d813e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.593123 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.593173 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28f5p\" (UniqueName: \"kubernetes.io/projected/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-kube-api-access-28f5p\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.593187 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.726071 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.762777 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f2qn"] Oct 07 11:45:27 crc kubenswrapper[4700]: I1007 11:45:27.968069 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" path="/var/lib/kubelet/pods/f2f7ffb6-7be6-46ab-b4c7-20244b5d813e/volumes" Oct 07 11:45:28 crc kubenswrapper[4700]: I1007 11:45:28.395764 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerStarted","Data":"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842"} Oct 07 11:45:30 crc kubenswrapper[4700]: I1007 
11:45:30.422145 4700 generic.go:334] "Generic (PLEG): container finished" podID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerID="16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842" exitCode=0 Oct 07 11:45:30 crc kubenswrapper[4700]: I1007 11:45:30.422217 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerDied","Data":"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842"} Oct 07 11:45:31 crc kubenswrapper[4700]: I1007 11:45:31.440555 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerStarted","Data":"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3"} Oct 07 11:45:31 crc kubenswrapper[4700]: I1007 11:45:31.461971 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h5tvz" podStartSLOduration=2.771203747 podStartE2EDuration="7.461944971s" podCreationTimestamp="2025-10-07 11:45:24 +0000 UTC" firstStartedPulling="2025-10-07 11:45:26.37220165 +0000 UTC m=+1493.168600649" lastFinishedPulling="2025-10-07 11:45:31.062942854 +0000 UTC m=+1497.859341873" observedRunningTime="2025-10-07 11:45:31.458918162 +0000 UTC m=+1498.255317201" watchObservedRunningTime="2025-10-07 11:45:31.461944971 +0000 UTC m=+1498.258343990" Oct 07 11:45:34 crc kubenswrapper[4700]: I1007 11:45:34.787409 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:34 crc kubenswrapper[4700]: I1007 11:45:34.787971 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:35 crc kubenswrapper[4700]: I1007 11:45:35.849562 4700 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-h5tvz" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="registry-server" probeResult="failure" output=< Oct 07 11:45:35 crc kubenswrapper[4700]: timeout: failed to connect service ":50051" within 1s Oct 07 11:45:35 crc kubenswrapper[4700]: > Oct 07 11:45:44 crc kubenswrapper[4700]: I1007 11:45:44.862602 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:44 crc kubenswrapper[4700]: I1007 11:45:44.935135 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.103733 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.334255 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.334359 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.334429 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.335222 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.335347 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" gracePeriod=600 Oct 07 11:45:45 crc kubenswrapper[4700]: E1007 11:45:45.468006 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.576760 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" exitCode=0 Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.576897 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4"} Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.576973 4700 scope.go:117] "RemoveContainer" containerID="2dd00ae003149c481b44bf29df7a596aca95ac6b3173a4a0af3e08d67d5e4363" Oct 07 11:45:45 crc kubenswrapper[4700]: I1007 11:45:45.577634 4700 
scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:45:45 crc kubenswrapper[4700]: E1007 11:45:45.578932 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:45:46 crc kubenswrapper[4700]: I1007 11:45:46.590248 4700 generic.go:334] "Generic (PLEG): container finished" podID="112df1a0-e767-41be-a95e-4f7e62024fa2" containerID="55d48e8123e7a01882152b88c0ed844cd98b53a93cc9ffea7c95b0fc0ba3f448" exitCode=0 Oct 07 11:45:46 crc kubenswrapper[4700]: I1007 11:45:46.590340 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" event={"ID":"112df1a0-e767-41be-a95e-4f7e62024fa2","Type":"ContainerDied","Data":"55d48e8123e7a01882152b88c0ed844cd98b53a93cc9ffea7c95b0fc0ba3f448"} Oct 07 11:45:46 crc kubenswrapper[4700]: I1007 11:45:46.593810 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h5tvz" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="registry-server" containerID="cri-o://4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3" gracePeriod=2 Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.107840 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.239757 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities\") pod \"08d24f2c-f43c-4512-923c-759d1b90f2c7\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.239895 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content\") pod \"08d24f2c-f43c-4512-923c-759d1b90f2c7\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.239920 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6xnb\" (UniqueName: \"kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb\") pod \"08d24f2c-f43c-4512-923c-759d1b90f2c7\" (UID: \"08d24f2c-f43c-4512-923c-759d1b90f2c7\") " Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.241437 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities" (OuterVolumeSpecName: "utilities") pod "08d24f2c-f43c-4512-923c-759d1b90f2c7" (UID: "08d24f2c-f43c-4512-923c-759d1b90f2c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.261212 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb" (OuterVolumeSpecName: "kube-api-access-s6xnb") pod "08d24f2c-f43c-4512-923c-759d1b90f2c7" (UID: "08d24f2c-f43c-4512-923c-759d1b90f2c7"). InnerVolumeSpecName "kube-api-access-s6xnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.342671 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6xnb\" (UniqueName: \"kubernetes.io/projected/08d24f2c-f43c-4512-923c-759d1b90f2c7-kube-api-access-s6xnb\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.342935 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.365705 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08d24f2c-f43c-4512-923c-759d1b90f2c7" (UID: "08d24f2c-f43c-4512-923c-759d1b90f2c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.444484 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d24f2c-f43c-4512-923c-759d1b90f2c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.605685 4700 generic.go:334] "Generic (PLEG): container finished" podID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerID="4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3" exitCode=0 Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.605784 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerDied","Data":"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3"} Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.605797 4700 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5tvz" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.605831 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5tvz" event={"ID":"08d24f2c-f43c-4512-923c-759d1b90f2c7","Type":"ContainerDied","Data":"073ea3b952cba4264439ca8066313002c1a082382dda3cb45f3ae27f915ee925"} Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.605852 4700 scope.go:117] "RemoveContainer" containerID="4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.661663 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.669560 4700 scope.go:117] "RemoveContainer" containerID="16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.671139 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h5tvz"] Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.699513 4700 scope.go:117] "RemoveContainer" containerID="50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.757640 4700 scope.go:117] "RemoveContainer" containerID="4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3" Oct 07 11:45:47 crc kubenswrapper[4700]: E1007 11:45:47.758069 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3\": container with ID starting with 4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3 not found: ID does not exist" containerID="4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.758127 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3"} err="failed to get container status \"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3\": rpc error: code = NotFound desc = could not find container \"4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3\": container with ID starting with 4dd32be83094ba3a91097aa76cef6b1ebc1b1a9d6ba2d4754aedbd01002ff9b3 not found: ID does not exist" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.758169 4700 scope.go:117] "RemoveContainer" containerID="16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842" Oct 07 11:45:47 crc kubenswrapper[4700]: E1007 11:45:47.758701 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842\": container with ID starting with 16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842 not found: ID does not exist" containerID="16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.758731 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842"} err="failed to get container status \"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842\": rpc error: code = NotFound desc = could not find container \"16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842\": container with ID starting with 16ca5e16cf8fe9e43faf43ef9d38a1d2f1d4a3522778e83ebe4dcf89d0314842 not found: ID does not exist" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.758756 4700 scope.go:117] "RemoveContainer" containerID="50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2" Oct 07 11:45:47 crc kubenswrapper[4700]: E1007 
11:45:47.759035 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2\": container with ID starting with 50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2 not found: ID does not exist" containerID="50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.759077 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2"} err="failed to get container status \"50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2\": rpc error: code = NotFound desc = could not find container \"50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2\": container with ID starting with 50906c5975829320dba8f0365a0c62e002ff885e5a53931ee73189db77f9c9e2 not found: ID does not exist" Oct 07 11:45:47 crc kubenswrapper[4700]: I1007 11:45:47.970263 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" path="/var/lib/kubelet/pods/08d24f2c-f43c-4512-923c-759d1b90f2c7/volumes" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.035230 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.160359 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key\") pod \"112df1a0-e767-41be-a95e-4f7e62024fa2\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.160839 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory\") pod \"112df1a0-e767-41be-a95e-4f7e62024fa2\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.160961 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle\") pod \"112df1a0-e767-41be-a95e-4f7e62024fa2\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.161168 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv7tl\" (UniqueName: \"kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl\") pod \"112df1a0-e767-41be-a95e-4f7e62024fa2\" (UID: \"112df1a0-e767-41be-a95e-4f7e62024fa2\") " Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.166760 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "112df1a0-e767-41be-a95e-4f7e62024fa2" (UID: "112df1a0-e767-41be-a95e-4f7e62024fa2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.167005 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl" (OuterVolumeSpecName: "kube-api-access-wv7tl") pod "112df1a0-e767-41be-a95e-4f7e62024fa2" (UID: "112df1a0-e767-41be-a95e-4f7e62024fa2"). InnerVolumeSpecName "kube-api-access-wv7tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.199892 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory" (OuterVolumeSpecName: "inventory") pod "112df1a0-e767-41be-a95e-4f7e62024fa2" (UID: "112df1a0-e767-41be-a95e-4f7e62024fa2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.210371 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "112df1a0-e767-41be-a95e-4f7e62024fa2" (UID: "112df1a0-e767-41be-a95e-4f7e62024fa2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.264045 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.264075 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.264085 4700 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112df1a0-e767-41be-a95e-4f7e62024fa2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.264095 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv7tl\" (UniqueName: \"kubernetes.io/projected/112df1a0-e767-41be-a95e-4f7e62024fa2-kube-api-access-wv7tl\") on node \"crc\" DevicePath \"\"" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.636104 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" event={"ID":"112df1a0-e767-41be-a95e-4f7e62024fa2","Type":"ContainerDied","Data":"8c8e599a4b174e0e91abfd16f8a53691308ac12983faf7e488d26d7a0cd19432"} Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.636166 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c8e599a4b174e0e91abfd16f8a53691308ac12983faf7e488d26d7a0cd19432" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.636265 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.746677 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn"] Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747783 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="extract-content" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747818 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="extract-content" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747842 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747850 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747869 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="extract-utilities" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747879 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="extract-utilities" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747901 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="extract-content" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747908 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="extract-content" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747917 4700 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="112df1a0-e767-41be-a95e-4f7e62024fa2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747926 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="112df1a0-e767-41be-a95e-4f7e62024fa2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747951 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="extract-utilities" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747958 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="extract-utilities" Oct 07 11:45:48 crc kubenswrapper[4700]: E1007 11:45:48.747968 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.747975 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.748216 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f7ffb6-7be6-46ab-b4c7-20244b5d813e" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.748236 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d24f2c-f43c-4512-923c-759d1b90f2c7" containerName="registry-server" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.748249 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="112df1a0-e767-41be-a95e-4f7e62024fa2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.749055 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.752868 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.753070 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.753217 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.753586 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.755559 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn"] Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.874166 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmpn\" (UniqueName: \"kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.874682 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.874758 
4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.977187 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.977247 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.977372 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmpn\" (UniqueName: \"kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.982758 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.983238 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:48 crc kubenswrapper[4700]: I1007 11:45:48.999830 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmpn\" (UniqueName: \"kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:49 crc kubenswrapper[4700]: I1007 11:45:49.121623 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:45:49 crc kubenswrapper[4700]: I1007 11:45:49.762906 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn"] Oct 07 11:45:49 crc kubenswrapper[4700]: W1007 11:45:49.768467 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2864034_dca7_4ae9_b846_17c9ba11e35c.slice/crio-4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131 WatchSource:0}: Error finding container 4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131: Status 404 returned error can't find the container with id 4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131 Oct 07 11:45:50 crc kubenswrapper[4700]: I1007 11:45:50.663164 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" event={"ID":"f2864034-dca7-4ae9-b846-17c9ba11e35c","Type":"ContainerStarted","Data":"0829f7217ef9d3b51b715debc8564833079451bc862449eade939ea045baf0f1"} Oct 07 11:45:50 crc kubenswrapper[4700]: I1007 11:45:50.663558 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" event={"ID":"f2864034-dca7-4ae9-b846-17c9ba11e35c","Type":"ContainerStarted","Data":"4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131"} Oct 07 11:45:50 crc kubenswrapper[4700]: I1007 11:45:50.692022 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" podStartSLOduration=2.154214542 podStartE2EDuration="2.691987622s" podCreationTimestamp="2025-10-07 11:45:48 +0000 UTC" firstStartedPulling="2025-10-07 11:45:49.770802942 +0000 UTC m=+1516.567201931" lastFinishedPulling="2025-10-07 11:45:50.308576012 +0000 UTC 
m=+1517.104975011" observedRunningTime="2025-10-07 11:45:50.679599718 +0000 UTC m=+1517.475998747" watchObservedRunningTime="2025-10-07 11:45:50.691987622 +0000 UTC m=+1517.488386651" Oct 07 11:45:55 crc kubenswrapper[4700]: I1007 11:45:55.957492 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:45:55 crc kubenswrapper[4700]: E1007 11:45:55.960829 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:46:04 crc kubenswrapper[4700]: I1007 11:46:04.009838 4700 scope.go:117] "RemoveContainer" containerID="9acc0339b9a2fc597468ca200b34801b9e52360bb8f3b2bf05a27c107357cbb1" Oct 07 11:46:04 crc kubenswrapper[4700]: I1007 11:46:04.049694 4700 scope.go:117] "RemoveContainer" containerID="276f1b7b755a38b65f42320526b73e464b73cd602d0a8ea7c35f278534e947d7" Oct 07 11:46:09 crc kubenswrapper[4700]: I1007 11:46:09.957738 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:46:09 crc kubenswrapper[4700]: E1007 11:46:09.958648 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:46:23 crc kubenswrapper[4700]: I1007 11:46:23.966437 4700 scope.go:117] "RemoveContainer" 
containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:46:23 crc kubenswrapper[4700]: E1007 11:46:23.967998 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.041503 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v9hfj"] Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.051367 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bbffr"] Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.058513 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-qn762"] Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.066031 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v9hfj"] Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.074101 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-qn762"] Oct 07 11:46:32 crc kubenswrapper[4700]: I1007 11:46:32.082078 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bbffr"] Oct 07 11:46:33 crc kubenswrapper[4700]: I1007 11:46:33.985541 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09" path="/var/lib/kubelet/pods/c5e9d9fd-6daa-4dd6-a1d7-3dbdf6384b09/volumes" Oct 07 11:46:33 crc kubenswrapper[4700]: I1007 11:46:33.987135 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8dd361-88e8-4530-9d61-ac84770c792e" 
path="/var/lib/kubelet/pods/ea8dd361-88e8-4530-9d61-ac84770c792e/volumes" Oct 07 11:46:33 crc kubenswrapper[4700]: I1007 11:46:33.988340 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9008225-ce95-4591-9a50-8f2982a231a5" path="/var/lib/kubelet/pods/f9008225-ce95-4591-9a50-8f2982a231a5/volumes" Oct 07 11:46:37 crc kubenswrapper[4700]: I1007 11:46:37.957896 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:46:37 crc kubenswrapper[4700]: E1007 11:46:37.958986 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:46:42 crc kubenswrapper[4700]: I1007 11:46:42.027079 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ddb5-account-create-6bjgp"] Oct 07 11:46:42 crc kubenswrapper[4700]: I1007 11:46:42.035643 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ddb5-account-create-6bjgp"] Oct 07 11:46:43 crc kubenswrapper[4700]: I1007 11:46:43.985149 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb24dac-4e3e-44db-bdb3-b5dc85b446af" path="/var/lib/kubelet/pods/3eb24dac-4e3e-44db-bdb3-b5dc85b446af/volumes" Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.047934 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4970-account-create-b52jf"] Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.063868 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5355-account-create-2pblh"] Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.075103 4700 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4970-account-create-b52jf"] Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.086735 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5355-account-create-2pblh"] Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.979777 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7923001-b1a5-4e98-8776-9acdf9f161a1" path="/var/lib/kubelet/pods/b7923001-b1a5-4e98-8776-9acdf9f161a1/volumes" Oct 07 11:46:47 crc kubenswrapper[4700]: I1007 11:46:47.981557 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefeb69f-ade8-41be-9525-ed5d017c8b2b" path="/var/lib/kubelet/pods/cefeb69f-ade8-41be-9525-ed5d017c8b2b/volumes" Oct 07 11:46:50 crc kubenswrapper[4700]: I1007 11:46:50.957524 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:46:50 crc kubenswrapper[4700]: E1007 11:46:50.958479 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.058411 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ckw8z"] Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.070028 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ckw8z"] Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.168142 4700 scope.go:117] "RemoveContainer" containerID="3b2c970b225bc607b376513ede6e3ecd84aa161923b43c27a1cc7917a322c541" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 
11:47:04.223455 4700 scope.go:117] "RemoveContainer" containerID="65d073bc1e52cf98a23725406668688e5c82fb318f5179cbc95e90e6c7ac74ce" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.279513 4700 scope.go:117] "RemoveContainer" containerID="b1c70a7c665f63dafd46d84580594819f06d1e9edaa6baa472bfafe3204dee91" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.341241 4700 scope.go:117] "RemoveContainer" containerID="fddd95d90b055b59f30571370ffbadef4dd2bfa0f1f04155a4586d921eee3bcd" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.371031 4700 scope.go:117] "RemoveContainer" containerID="de7e272971a3f058c8529c2a1e7dc497030c88992d5aedb6dfd742b8a505cbd8" Oct 07 11:47:04 crc kubenswrapper[4700]: I1007 11:47:04.413223 4700 scope.go:117] "RemoveContainer" containerID="a5b0c25c2b2a7455592b2febbd21a4fe93b84d1ed938c4e29250fa55202ee554" Oct 07 11:47:05 crc kubenswrapper[4700]: I1007 11:47:05.957724 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:47:05 crc kubenswrapper[4700]: E1007 11:47:05.958130 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:47:05 crc kubenswrapper[4700]: I1007 11:47:05.974197 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392647fc-fc26-4946-9e59-9a3af9283041" path="/var/lib/kubelet/pods/392647fc-fc26-4946-9e59-9a3af9283041/volumes" Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.040296 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-964gm"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.055958 4700 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cktjx"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.074715 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-ffmxn"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.091033 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-964gm"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.106174 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cktjx"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.115847 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-ffmxn"] Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.970885 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f08d93-f0df-4917-a21a-70f26ac33e1f" path="/var/lib/kubelet/pods/41f08d93-f0df-4917-a21a-70f26ac33e1f/volumes" Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.971926 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd89bc50-bb18-405d-863b-5edf57509f7f" path="/var/lib/kubelet/pods/cd89bc50-bb18-405d-863b-5edf57509f7f/volumes" Oct 07 11:47:07 crc kubenswrapper[4700]: I1007 11:47:07.972803 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de501b70-8d75-489f-9826-e0f1dd7306f7" path="/var/lib/kubelet/pods/de501b70-8d75-489f-9826-e0f1dd7306f7/volumes" Oct 07 11:47:08 crc kubenswrapper[4700]: I1007 11:47:08.032341 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-24ghv"] Oct 07 11:47:08 crc kubenswrapper[4700]: I1007 11:47:08.043549 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-24ghv"] Oct 07 11:47:09 crc kubenswrapper[4700]: I1007 11:47:09.976095 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ee0582-037f-452a-a529-bedd3e1f51c9" 
path="/var/lib/kubelet/pods/40ee0582-037f-452a-a529-bedd3e1f51c9/volumes" Oct 07 11:47:13 crc kubenswrapper[4700]: I1007 11:47:13.034963 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8rr4j"] Oct 07 11:47:13 crc kubenswrapper[4700]: I1007 11:47:13.045865 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8rr4j"] Oct 07 11:47:13 crc kubenswrapper[4700]: I1007 11:47:13.976832 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c677ff-4aff-4044-b765-e0b3de056302" path="/var/lib/kubelet/pods/90c677ff-4aff-4044-b765-e0b3de056302/volumes" Oct 07 11:47:14 crc kubenswrapper[4700]: I1007 11:47:14.054623 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-897f-account-create-qqqnz"] Oct 07 11:47:14 crc kubenswrapper[4700]: I1007 11:47:14.070961 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4750-account-create-h6rmf"] Oct 07 11:47:14 crc kubenswrapper[4700]: I1007 11:47:14.082491 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-897f-account-create-qqqnz"] Oct 07 11:47:14 crc kubenswrapper[4700]: I1007 11:47:14.093861 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4750-account-create-h6rmf"] Oct 07 11:47:15 crc kubenswrapper[4700]: I1007 11:47:15.969977 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9" path="/var/lib/kubelet/pods/1ab6e4ca-6676-4f0b-ac32-ca674ee7dfa9/volumes" Oct 07 11:47:15 crc kubenswrapper[4700]: I1007 11:47:15.971040 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bce568a-3e8c-4994-ab06-02cec9b80cc3" path="/var/lib/kubelet/pods/6bce568a-3e8c-4994-ab06-02cec9b80cc3/volumes" Oct 07 11:47:20 crc kubenswrapper[4700]: I1007 11:47:20.957543 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 
07 11:47:20 crc kubenswrapper[4700]: E1007 11:47:20.958192 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:47:26 crc kubenswrapper[4700]: I1007 11:47:26.800504 4700 generic.go:334] "Generic (PLEG): container finished" podID="f2864034-dca7-4ae9-b846-17c9ba11e35c" containerID="0829f7217ef9d3b51b715debc8564833079451bc862449eade939ea045baf0f1" exitCode=0 Oct 07 11:47:26 crc kubenswrapper[4700]: I1007 11:47:26.800987 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" event={"ID":"f2864034-dca7-4ae9-b846-17c9ba11e35c","Type":"ContainerDied","Data":"0829f7217ef9d3b51b715debc8564833079451bc862449eade939ea045baf0f1"} Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.279004 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.416778 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key\") pod \"f2864034-dca7-4ae9-b846-17c9ba11e35c\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.416888 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcmpn\" (UniqueName: \"kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn\") pod \"f2864034-dca7-4ae9-b846-17c9ba11e35c\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.417116 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory\") pod \"f2864034-dca7-4ae9-b846-17c9ba11e35c\" (UID: \"f2864034-dca7-4ae9-b846-17c9ba11e35c\") " Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.423373 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn" (OuterVolumeSpecName: "kube-api-access-wcmpn") pod "f2864034-dca7-4ae9-b846-17c9ba11e35c" (UID: "f2864034-dca7-4ae9-b846-17c9ba11e35c"). InnerVolumeSpecName "kube-api-access-wcmpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.453230 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2864034-dca7-4ae9-b846-17c9ba11e35c" (UID: "f2864034-dca7-4ae9-b846-17c9ba11e35c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.453234 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory" (OuterVolumeSpecName: "inventory") pod "f2864034-dca7-4ae9-b846-17c9ba11e35c" (UID: "f2864034-dca7-4ae9-b846-17c9ba11e35c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.520580 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.520617 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2864034-dca7-4ae9-b846-17c9ba11e35c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.520630 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcmpn\" (UniqueName: \"kubernetes.io/projected/f2864034-dca7-4ae9-b846-17c9ba11e35c-kube-api-access-wcmpn\") on node \"crc\" DevicePath \"\"" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.826301 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" event={"ID":"f2864034-dca7-4ae9-b846-17c9ba11e35c","Type":"ContainerDied","Data":"4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131"} Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.826410 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.827478 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8e3f5ea5856d4a763818cafa36c89c5f83b5865d0cddfb10c5d8b17732d131" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.933425 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml"] Oct 07 11:47:28 crc kubenswrapper[4700]: E1007 11:47:28.933937 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2864034-dca7-4ae9-b846-17c9ba11e35c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.933972 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2864034-dca7-4ae9-b846-17c9ba11e35c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.934289 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2864034-dca7-4ae9-b846-17c9ba11e35c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.935158 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.941507 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.943066 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.943428 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.943664 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:47:28 crc kubenswrapper[4700]: I1007 11:47:28.968097 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml"] Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.030531 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.030720 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: 
I1007 11:47:29.031153 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.133514 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.133750 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.133800 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.143863 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.159183 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.167850 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hxcml\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.269219 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.848762 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml"] Oct 07 11:47:29 crc kubenswrapper[4700]: W1007 11:47:29.851939 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0170992_8798_4624_9953_368a237e9903.slice/crio-e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1 WatchSource:0}: Error finding container e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1: Status 404 returned error can't find the container with id e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1 Oct 07 11:47:29 crc kubenswrapper[4700]: I1007 11:47:29.856445 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 11:47:30 crc kubenswrapper[4700]: I1007 11:47:30.852516 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" event={"ID":"e0170992-8798-4624-9953-368a237e9903","Type":"ContainerStarted","Data":"cde1144e619f5cafea29009ea1c4a359e971cd1e4226cfc611ef2c32daa87d9f"} Oct 07 11:47:30 crc kubenswrapper[4700]: I1007 11:47:30.852863 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" event={"ID":"e0170992-8798-4624-9953-368a237e9903","Type":"ContainerStarted","Data":"e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1"} Oct 07 11:47:30 crc kubenswrapper[4700]: I1007 11:47:30.882803 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" podStartSLOduration=2.179916215 podStartE2EDuration="2.882781249s" 
podCreationTimestamp="2025-10-07 11:47:28 +0000 UTC" firstStartedPulling="2025-10-07 11:47:29.856116305 +0000 UTC m=+1616.652515314" lastFinishedPulling="2025-10-07 11:47:30.558981349 +0000 UTC m=+1617.355380348" observedRunningTime="2025-10-07 11:47:30.878106207 +0000 UTC m=+1617.674505196" watchObservedRunningTime="2025-10-07 11:47:30.882781249 +0000 UTC m=+1617.679180238" Oct 07 11:47:34 crc kubenswrapper[4700]: I1007 11:47:34.956935 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:47:34 crc kubenswrapper[4700]: E1007 11:47:34.957637 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:47:40 crc kubenswrapper[4700]: I1007 11:47:40.056788 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-f1ef-account-create-jrd5q"] Oct 07 11:47:40 crc kubenswrapper[4700]: I1007 11:47:40.068745 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-f1ef-account-create-jrd5q"] Oct 07 11:47:40 crc kubenswrapper[4700]: I1007 11:47:40.076003 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b0d7-account-create-kdfv5"] Oct 07 11:47:40 crc kubenswrapper[4700]: I1007 11:47:40.083733 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b0d7-account-create-kdfv5"] Oct 07 11:47:41 crc kubenswrapper[4700]: I1007 11:47:41.973228 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980702b0-2482-4109-8ece-d5ecef7f84fc" path="/var/lib/kubelet/pods/980702b0-2482-4109-8ece-d5ecef7f84fc/volumes" Oct 07 11:47:41 crc 
kubenswrapper[4700]: I1007 11:47:41.974632 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4cbf147-32c1-4ba8-b7b0-f91d1df46101" path="/var/lib/kubelet/pods/f4cbf147-32c1-4ba8-b7b0-f91d1df46101/volumes" Oct 07 11:47:43 crc kubenswrapper[4700]: I1007 11:47:43.039372 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nkx5c"] Oct 07 11:47:43 crc kubenswrapper[4700]: I1007 11:47:43.052931 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nkx5c"] Oct 07 11:47:43 crc kubenswrapper[4700]: I1007 11:47:43.978701 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ccc0ab6-f412-449d-bef2-82bbd06f3d9e" path="/var/lib/kubelet/pods/4ccc0ab6-f412-449d-bef2-82bbd06f3d9e/volumes" Oct 07 11:47:44 crc kubenswrapper[4700]: I1007 11:47:44.045225 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ppsbl"] Oct 07 11:47:44 crc kubenswrapper[4700]: I1007 11:47:44.054135 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6ftwf"] Oct 07 11:47:44 crc kubenswrapper[4700]: I1007 11:47:44.062878 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6ftwf"] Oct 07 11:47:44 crc kubenswrapper[4700]: I1007 11:47:44.071699 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ppsbl"] Oct 07 11:47:45 crc kubenswrapper[4700]: I1007 11:47:45.975492 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7aaa144-2b28-48f6-8398-d4b0766f53f4" path="/var/lib/kubelet/pods/c7aaa144-2b28-48f6-8398-d4b0766f53f4/volumes" Oct 07 11:47:45 crc kubenswrapper[4700]: I1007 11:47:45.977673 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e383a2d1-cd6e-4658-9148-ef9c25c63430" path="/var/lib/kubelet/pods/e383a2d1-cd6e-4658-9148-ef9c25c63430/volumes" Oct 07 11:47:46 crc kubenswrapper[4700]: I1007 11:47:46.957738 
4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:47:46 crc kubenswrapper[4700]: E1007 11:47:46.958588 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:47:59 crc kubenswrapper[4700]: I1007 11:47:59.957951 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:47:59 crc kubenswrapper[4700]: E1007 11:47:59.958968 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.574855 4700 scope.go:117] "RemoveContainer" containerID="a13a9a288261d076586d9b2623bfc75c55db7f5b74cc6e9d7728bf9295009722" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.621888 4700 scope.go:117] "RemoveContainer" containerID="6745a98b7adf70ff20c6d8fe1332644b0c1436c7dec4ad9e2e9653dcb7a9a283" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.677766 4700 scope.go:117] "RemoveContainer" containerID="64f961a309a5f564e3068d2b7817cb5cc6fe374991c15711927ddcdc348fc56a" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.737750 4700 scope.go:117] "RemoveContainer" containerID="9ed83b1e7864fed35099bab1bc30f07bce8ccf05ae47ab108205a8bed0d03735" 
Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.778253 4700 scope.go:117] "RemoveContainer" containerID="e7570c0d7e2d342d603bedebed72f3dbb2dc11e306ee4c4366531fd76e44ed6a" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.807812 4700 scope.go:117] "RemoveContainer" containerID="119214a849ac67aa5eea076516b83ce8f957dd596c73519418edbb8198b10be8" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.848608 4700 scope.go:117] "RemoveContainer" containerID="40348253aebbd51116f135f23c070043b62fbab44caf0b885192942bf8a6c408" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.889727 4700 scope.go:117] "RemoveContainer" containerID="9819328bfdc94d3425997889115d50d17e11b13b7ab4f63ae8a7b884ef3f8890" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.960196 4700 scope.go:117] "RemoveContainer" containerID="9cf6042d6634403e51838cea86a617b31b97412c31cddb072e05249464f8d133" Oct 07 11:48:04 crc kubenswrapper[4700]: I1007 11:48:04.979412 4700 scope.go:117] "RemoveContainer" containerID="a2f607e9371a10c56ee4950077f53fad0da7e1ebb8672a4013168a25ebad525e" Oct 07 11:48:05 crc kubenswrapper[4700]: I1007 11:48:05.004376 4700 scope.go:117] "RemoveContainer" containerID="f5373611b12383a8276693b628d0d46bbe05067429ae8a07113a26b059d96ea7" Oct 07 11:48:05 crc kubenswrapper[4700]: I1007 11:48:05.047093 4700 scope.go:117] "RemoveContainer" containerID="ddf67e85849c352426d96ae60bc9a62de6d8fbb658ded3171492394360995f3e" Oct 07 11:48:05 crc kubenswrapper[4700]: I1007 11:48:05.076034 4700 scope.go:117] "RemoveContainer" containerID="c4a7c8d6c1646705da0979076f65c39e23bb8d3e449de3dc7b8a5014ca3b2f4e" Oct 07 11:48:05 crc kubenswrapper[4700]: I1007 11:48:05.094954 4700 scope.go:117] "RemoveContainer" containerID="9ecaf3e0157a3c65e904791cca5a5879c87804fb536def8eb1e20b4919783cc9" Oct 07 11:48:05 crc kubenswrapper[4700]: I1007 11:48:05.122000 4700 scope.go:117] "RemoveContainer" containerID="d6519d39549e4cafa9a6067d639928383eaf545d245f7975404db2d3de28f026" Oct 07 11:48:08 crc 
kubenswrapper[4700]: I1007 11:48:08.056876 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-lp48h"] Oct 07 11:48:08 crc kubenswrapper[4700]: I1007 11:48:08.069816 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-lp48h"] Oct 07 11:48:09 crc kubenswrapper[4700]: I1007 11:48:09.976721 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e22581-4f7e-45b6-9932-aaa790c7825f" path="/var/lib/kubelet/pods/84e22581-4f7e-45b6-9932-aaa790c7825f/volumes" Oct 07 11:48:10 crc kubenswrapper[4700]: I1007 11:48:10.038339 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bqgg5"] Oct 07 11:48:10 crc kubenswrapper[4700]: I1007 11:48:10.053058 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bqgg5"] Oct 07 11:48:11 crc kubenswrapper[4700]: I1007 11:48:11.028994 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z8zs9"] Oct 07 11:48:11 crc kubenswrapper[4700]: I1007 11:48:11.042641 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z8zs9"] Oct 07 11:48:11 crc kubenswrapper[4700]: I1007 11:48:11.968945 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587378c1-b0c6-4d31-980f-0b6e8e271903" path="/var/lib/kubelet/pods/587378c1-b0c6-4d31-980f-0b6e8e271903/volumes" Oct 07 11:48:11 crc kubenswrapper[4700]: I1007 11:48:11.969730 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc86b02-ca87-4bda-869a-6fd42e5f5f1b" path="/var/lib/kubelet/pods/bdc86b02-ca87-4bda-869a-6fd42e5f5f1b/volumes" Oct 07 11:48:14 crc kubenswrapper[4700]: I1007 11:48:14.957604 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:48:14 crc kubenswrapper[4700]: E1007 11:48:14.958128 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:48:25 crc kubenswrapper[4700]: I1007 11:48:25.958140 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:48:25 crc kubenswrapper[4700]: E1007 11:48:25.959346 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:48:40 crc kubenswrapper[4700]: I1007 11:48:40.958616 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:48:40 crc kubenswrapper[4700]: E1007 11:48:40.959388 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:48:47 crc kubenswrapper[4700]: I1007 11:48:47.652952 4700 generic.go:334] "Generic (PLEG): container finished" podID="e0170992-8798-4624-9953-368a237e9903" containerID="cde1144e619f5cafea29009ea1c4a359e971cd1e4226cfc611ef2c32daa87d9f" exitCode=0 Oct 07 11:48:47 crc kubenswrapper[4700]: I1007 11:48:47.653061 
4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" event={"ID":"e0170992-8798-4624-9953-368a237e9903","Type":"ContainerDied","Data":"cde1144e619f5cafea29009ea1c4a359e971cd1e4226cfc611ef2c32daa87d9f"} Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.111063 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.186758 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq\") pod \"e0170992-8798-4624-9953-368a237e9903\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.186931 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory\") pod \"e0170992-8798-4624-9953-368a237e9903\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.186958 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key\") pod \"e0170992-8798-4624-9953-368a237e9903\" (UID: \"e0170992-8798-4624-9953-368a237e9903\") " Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.193519 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq" (OuterVolumeSpecName: "kube-api-access-wjqqq") pod "e0170992-8798-4624-9953-368a237e9903" (UID: "e0170992-8798-4624-9953-368a237e9903"). InnerVolumeSpecName "kube-api-access-wjqqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.220646 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0170992-8798-4624-9953-368a237e9903" (UID: "e0170992-8798-4624-9953-368a237e9903"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.221402 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory" (OuterVolumeSpecName: "inventory") pod "e0170992-8798-4624-9953-368a237e9903" (UID: "e0170992-8798-4624-9953-368a237e9903"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.289012 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.289063 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0170992-8798-4624-9953-368a237e9903-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.289077 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/e0170992-8798-4624-9953-368a237e9903-kube-api-access-wjqqq\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.672712 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" 
event={"ID":"e0170992-8798-4624-9953-368a237e9903","Type":"ContainerDied","Data":"e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1"} Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.672924 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9daa6ca01cba4a31f5edc333afb1bccbfbf4aa09cc888fed05912e7f9d205c1" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.672805 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hxcml" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.775515 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk"] Oct 07 11:48:49 crc kubenswrapper[4700]: E1007 11:48:49.776006 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0170992-8798-4624-9953-368a237e9903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.776027 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0170992-8798-4624-9953-368a237e9903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.776272 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0170992-8798-4624-9953-368a237e9903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.777144 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.780908 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.781217 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.781508 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.781674 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.787514 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk"] Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.897451 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.897491 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.897522 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2s9z\" (UniqueName: \"kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.999098 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.999142 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:49 crc kubenswrapper[4700]: I1007 11:48:49.999162 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2s9z\" (UniqueName: \"kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:50 crc kubenswrapper[4700]: I1007 11:48:50.002276 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:50 crc kubenswrapper[4700]: I1007 11:48:50.002382 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:50 crc kubenswrapper[4700]: I1007 11:48:50.027567 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2s9z\" (UniqueName: \"kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:50 crc kubenswrapper[4700]: I1007 11:48:50.098256 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:50 crc kubenswrapper[4700]: I1007 11:48:50.686774 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk"] Oct 07 11:48:51 crc kubenswrapper[4700]: I1007 11:48:51.704230 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" event={"ID":"fb8189cc-a34b-4793-91c0-7c4d5b837374","Type":"ContainerStarted","Data":"d65388c9a39191ac575c3a34b091b26962400bb1d951ce6cfe866b0f65942710"} Oct 07 11:48:51 crc kubenswrapper[4700]: I1007 11:48:51.704897 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" event={"ID":"fb8189cc-a34b-4793-91c0-7c4d5b837374","Type":"ContainerStarted","Data":"e717b5643a8f5d12a499bbc6622dae4a236a8555ee3cc932fd980ad46d98365a"} Oct 07 11:48:51 crc kubenswrapper[4700]: I1007 11:48:51.723678 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" podStartSLOduration=2.092235875 podStartE2EDuration="2.72366019s" podCreationTimestamp="2025-10-07 11:48:49 +0000 UTC" firstStartedPulling="2025-10-07 11:48:50.688445923 +0000 UTC m=+1697.484844952" lastFinishedPulling="2025-10-07 11:48:51.319870268 +0000 UTC m=+1698.116269267" observedRunningTime="2025-10-07 11:48:51.718583747 +0000 UTC m=+1698.514982756" watchObservedRunningTime="2025-10-07 11:48:51.72366019 +0000 UTC m=+1698.520059189" Oct 07 11:48:54 crc kubenswrapper[4700]: I1007 11:48:54.956944 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:48:54 crc kubenswrapper[4700]: E1007 11:48:54.957656 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:48:56 crc kubenswrapper[4700]: I1007 11:48:56.759965 4700 generic.go:334] "Generic (PLEG): container finished" podID="fb8189cc-a34b-4793-91c0-7c4d5b837374" containerID="d65388c9a39191ac575c3a34b091b26962400bb1d951ce6cfe866b0f65942710" exitCode=0 Oct 07 11:48:56 crc kubenswrapper[4700]: I1007 11:48:56.760077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" event={"ID":"fb8189cc-a34b-4793-91c0-7c4d5b837374","Type":"ContainerDied","Data":"d65388c9a39191ac575c3a34b091b26962400bb1d951ce6cfe866b0f65942710"} Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.276660 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.360334 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2s9z\" (UniqueName: \"kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z\") pod \"fb8189cc-a34b-4793-91c0-7c4d5b837374\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.360505 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key\") pod \"fb8189cc-a34b-4793-91c0-7c4d5b837374\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.360531 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory\") pod \"fb8189cc-a34b-4793-91c0-7c4d5b837374\" (UID: \"fb8189cc-a34b-4793-91c0-7c4d5b837374\") " Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.366913 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z" (OuterVolumeSpecName: "kube-api-access-c2s9z") pod "fb8189cc-a34b-4793-91c0-7c4d5b837374" (UID: "fb8189cc-a34b-4793-91c0-7c4d5b837374"). InnerVolumeSpecName "kube-api-access-c2s9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.391665 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory" (OuterVolumeSpecName: "inventory") pod "fb8189cc-a34b-4793-91c0-7c4d5b837374" (UID: "fb8189cc-a34b-4793-91c0-7c4d5b837374"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.408941 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fb8189cc-a34b-4793-91c0-7c4d5b837374" (UID: "fb8189cc-a34b-4793-91c0-7c4d5b837374"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.462533 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2s9z\" (UniqueName: \"kubernetes.io/projected/fb8189cc-a34b-4793-91c0-7c4d5b837374-kube-api-access-c2s9z\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.462565 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.462573 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8189cc-a34b-4793-91c0-7c4d5b837374-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.782300 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" event={"ID":"fb8189cc-a34b-4793-91c0-7c4d5b837374","Type":"ContainerDied","Data":"e717b5643a8f5d12a499bbc6622dae4a236a8555ee3cc932fd980ad46d98365a"} Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.782382 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e717b5643a8f5d12a499bbc6622dae4a236a8555ee3cc932fd980ad46d98365a" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.782458 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.913980 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk"] Oct 07 11:48:58 crc kubenswrapper[4700]: E1007 11:48:58.914502 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8189cc-a34b-4793-91c0-7c4d5b837374" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.914528 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8189cc-a34b-4793-91c0-7c4d5b837374" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.914772 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8189cc-a34b-4793-91c0-7c4d5b837374" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.915674 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.919714 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.920216 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.921401 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.923265 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.934374 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk"] Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.973252 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbb8\" (UniqueName: \"kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.973500 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:58 crc kubenswrapper[4700]: I1007 11:48:58.973998 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.075881 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbb8\" (UniqueName: \"kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.075971 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.076001 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.081592 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: 
\"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.081682 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.097883 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbb8\" (UniqueName: \"kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lnpzk\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.248164 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:48:59 crc kubenswrapper[4700]: I1007 11:48:59.788386 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk"] Oct 07 11:49:00 crc kubenswrapper[4700]: I1007 11:49:00.803716 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" event={"ID":"52767ee1-91e5-47e9-b945-70e2a4df6ec8","Type":"ContainerStarted","Data":"3f0929315f9de2e5c67938904987714d7d6c17cfaea04444b86b380affffee8e"} Oct 07 11:49:00 crc kubenswrapper[4700]: I1007 11:49:00.805285 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" event={"ID":"52767ee1-91e5-47e9-b945-70e2a4df6ec8","Type":"ContainerStarted","Data":"77f15fd23fe90e0987bd315d20edbeb73a5a66a74a89526812b7f470cd8a09ce"} Oct 07 11:49:00 crc kubenswrapper[4700]: I1007 11:49:00.838512 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" podStartSLOduration=2.457799788 podStartE2EDuration="2.838489735s" podCreationTimestamp="2025-10-07 11:48:58 +0000 UTC" firstStartedPulling="2025-10-07 11:48:59.803925466 +0000 UTC m=+1706.600324455" lastFinishedPulling="2025-10-07 11:49:00.184615413 +0000 UTC m=+1706.981014402" observedRunningTime="2025-10-07 11:49:00.822856606 +0000 UTC m=+1707.619255635" watchObservedRunningTime="2025-10-07 11:49:00.838489735 +0000 UTC m=+1707.634888734" Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.059248 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jxw7t"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.074608 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5dmmx"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.087787 
4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-f4pcg"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.097273 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5dmmx"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.106454 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jxw7t"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.115904 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-f4pcg"] Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.969062 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa" path="/var/lib/kubelet/pods/1029c91b-8d5a-4de4-9fa3-c28f0ea3e5fa/volumes" Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.970086 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62cb423-c171-42e0-a579-ec4b427d440a" path="/var/lib/kubelet/pods/d62cb423-c171-42e0-a579-ec4b427d440a/volumes" Oct 07 11:49:03 crc kubenswrapper[4700]: I1007 11:49:03.970608 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a" path="/var/lib/kubelet/pods/ef10aef3-784c-4c7a-a9a9-5f1e36e84d2a/volumes" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.404068 4700 scope.go:117] "RemoveContainer" containerID="8b96c58a36186d71ac832d39b310bc7f1fae8b9a4826f510d713d33a95b97532" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.454754 4700 scope.go:117] "RemoveContainer" containerID="073d1599228f72beeb612a3a5d4bbdf370d05377ccd988afaeea78d51d21cd05" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.514587 4700 scope.go:117] "RemoveContainer" containerID="45857cec3ee0b45c8823a4b8f0dca498bc775ba0c6e193d6f84fd99f9ee93ba4" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.551775 4700 scope.go:117] "RemoveContainer" 
containerID="f47c5fc54b832e365d48fd596c6fe520b6734a1d06be329d0926b66509fcc143" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.600529 4700 scope.go:117] "RemoveContainer" containerID="ca5c7ddde7f22ed9ac8ab549ef3bd576dd79b6a5c2690b8e5d8c0384bc5092b3" Oct 07 11:49:05 crc kubenswrapper[4700]: I1007 11:49:05.620043 4700 scope.go:117] "RemoveContainer" containerID="f4bc679c9b20142445a482ac4aba8ea4e86969665d47bbf62800fd88ca36709c" Oct 07 11:49:06 crc kubenswrapper[4700]: I1007 11:49:06.957416 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:49:06 crc kubenswrapper[4700]: E1007 11:49:06.957978 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:49:14 crc kubenswrapper[4700]: I1007 11:49:14.039576 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f251-account-create-fjksm"] Oct 07 11:49:14 crc kubenswrapper[4700]: I1007 11:49:14.052220 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f251-account-create-fjksm"] Oct 07 11:49:14 crc kubenswrapper[4700]: I1007 11:49:14.065886 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4be5-account-create-lnppp"] Oct 07 11:49:14 crc kubenswrapper[4700]: I1007 11:49:14.074750 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0850-account-create-hvrkb"] Oct 07 11:49:14 crc kubenswrapper[4700]: I1007 11:49:14.082498 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0850-account-create-hvrkb"] Oct 07 11:49:14 crc 
kubenswrapper[4700]: I1007 11:49:14.089181 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4be5-account-create-lnppp"] Oct 07 11:49:15 crc kubenswrapper[4700]: I1007 11:49:15.975712 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7a1185-eb36-4f41-83ad-98cf6b028339" path="/var/lib/kubelet/pods/3e7a1185-eb36-4f41-83ad-98cf6b028339/volumes" Oct 07 11:49:15 crc kubenswrapper[4700]: I1007 11:49:15.977201 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5a0b33-a377-4d96-acc6-3e9eb27bff2d" path="/var/lib/kubelet/pods/cb5a0b33-a377-4d96-acc6-3e9eb27bff2d/volumes" Oct 07 11:49:15 crc kubenswrapper[4700]: I1007 11:49:15.978219 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7a27a6-846b-4217-98d2-0bb89c409392" path="/var/lib/kubelet/pods/ff7a27a6-846b-4217-98d2-0bb89c409392/volumes" Oct 07 11:49:17 crc kubenswrapper[4700]: I1007 11:49:17.959128 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:49:17 crc kubenswrapper[4700]: E1007 11:49:17.960362 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:49:28 crc kubenswrapper[4700]: I1007 11:49:28.957739 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:49:28 crc kubenswrapper[4700]: E1007 11:49:28.959145 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:49:34 crc kubenswrapper[4700]: I1007 11:49:34.040281 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cj2z"] Oct 07 11:49:34 crc kubenswrapper[4700]: I1007 11:49:34.054670 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cj2z"] Oct 07 11:49:35 crc kubenswrapper[4700]: I1007 11:49:35.975459 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed147029-af21-44d2-9243-98161b542425" path="/var/lib/kubelet/pods/ed147029-af21-44d2-9243-98161b542425/volumes" Oct 07 11:49:41 crc kubenswrapper[4700]: I1007 11:49:41.299395 4700 generic.go:334] "Generic (PLEG): container finished" podID="52767ee1-91e5-47e9-b945-70e2a4df6ec8" containerID="3f0929315f9de2e5c67938904987714d7d6c17cfaea04444b86b380affffee8e" exitCode=0 Oct 07 11:49:41 crc kubenswrapper[4700]: I1007 11:49:41.299449 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" event={"ID":"52767ee1-91e5-47e9-b945-70e2a4df6ec8","Type":"ContainerDied","Data":"3f0929315f9de2e5c67938904987714d7d6c17cfaea04444b86b380affffee8e"} Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.694971 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.844889 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key\") pod \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.844978 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory\") pod \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.845266 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjbb8\" (UniqueName: \"kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8\") pod \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\" (UID: \"52767ee1-91e5-47e9-b945-70e2a4df6ec8\") " Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.852820 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8" (OuterVolumeSpecName: "kube-api-access-wjbb8") pod "52767ee1-91e5-47e9-b945-70e2a4df6ec8" (UID: "52767ee1-91e5-47e9-b945-70e2a4df6ec8"). InnerVolumeSpecName "kube-api-access-wjbb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.879799 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52767ee1-91e5-47e9-b945-70e2a4df6ec8" (UID: "52767ee1-91e5-47e9-b945-70e2a4df6ec8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.881578 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory" (OuterVolumeSpecName: "inventory") pod "52767ee1-91e5-47e9-b945-70e2a4df6ec8" (UID: "52767ee1-91e5-47e9-b945-70e2a4df6ec8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.948435 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjbb8\" (UniqueName: \"kubernetes.io/projected/52767ee1-91e5-47e9-b945-70e2a4df6ec8-kube-api-access-wjbb8\") on node \"crc\" DevicePath \"\"" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.948473 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.948486 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52767ee1-91e5-47e9-b945-70e2a4df6ec8-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:49:42 crc kubenswrapper[4700]: I1007 11:49:42.957735 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:49:42 crc kubenswrapper[4700]: E1007 11:49:42.958137 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.320217 
4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" event={"ID":"52767ee1-91e5-47e9-b945-70e2a4df6ec8","Type":"ContainerDied","Data":"77f15fd23fe90e0987bd315d20edbeb73a5a66a74a89526812b7f470cd8a09ce"} Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.320267 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lnpzk" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.320273 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77f15fd23fe90e0987bd315d20edbeb73a5a66a74a89526812b7f470cd8a09ce" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.440011 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9"] Oct 07 11:49:43 crc kubenswrapper[4700]: E1007 11:49:43.441148 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52767ee1-91e5-47e9-b945-70e2a4df6ec8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.441228 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="52767ee1-91e5-47e9-b945-70e2a4df6ec8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.441841 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="52767ee1-91e5-47e9-b945-70e2a4df6ec8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.443585 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.446340 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.447838 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.450136 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.454845 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.464030 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9"] Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.561107 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.561323 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.561359 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mvw\" (UniqueName: \"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.662964 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.663317 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mvw\" (UniqueName: \"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.663431 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.668018 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: 
\"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.668593 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.681743 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mvw\" (UniqueName: \"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:43 crc kubenswrapper[4700]: I1007 11:49:43.791469 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:49:44 crc kubenswrapper[4700]: I1007 11:49:44.365246 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9"] Oct 07 11:49:45 crc kubenswrapper[4700]: I1007 11:49:45.336750 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" event={"ID":"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff","Type":"ContainerStarted","Data":"0a33e635b3236dcaf7eb07c29383a1f636397013e38e480fc2de7ac73676d9d3"} Oct 07 11:49:45 crc kubenswrapper[4700]: I1007 11:49:45.337174 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" event={"ID":"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff","Type":"ContainerStarted","Data":"f6324501931d678276d9dee2b8de6631dea634e2fdeec2f93a139983b1aed7be"} Oct 07 11:49:45 crc kubenswrapper[4700]: I1007 11:49:45.361901 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" podStartSLOduration=1.6887853069999998 podStartE2EDuration="2.361875762s" podCreationTimestamp="2025-10-07 11:49:43 +0000 UTC" firstStartedPulling="2025-10-07 11:49:44.373476419 +0000 UTC m=+1751.169875408" lastFinishedPulling="2025-10-07 11:49:45.046566864 +0000 UTC m=+1751.842965863" observedRunningTime="2025-10-07 11:49:45.352616459 +0000 UTC m=+1752.149015488" watchObservedRunningTime="2025-10-07 11:49:45.361875762 +0000 UTC m=+1752.158274791" Oct 07 11:49:55 crc kubenswrapper[4700]: I1007 11:49:55.957831 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:49:55 crc kubenswrapper[4700]: E1007 11:49:55.958948 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.041403 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk898"] Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.056229 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7vgrf"] Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.069665 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk898"] Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.085925 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7vgrf"] Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.976249 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3089c2b1-1959-42bb-87ff-1f9c4ab4d01d" path="/var/lib/kubelet/pods/3089c2b1-1959-42bb-87ff-1f9c4ab4d01d/volumes" Oct 07 11:49:57 crc kubenswrapper[4700]: I1007 11:49:57.977500 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f14d077-e353-4931-a13e-c2184c602276" path="/var/lib/kubelet/pods/8f14d077-e353-4931-a13e-c2184c602276/volumes" Oct 07 11:50:05 crc kubenswrapper[4700]: I1007 11:50:05.870497 4700 scope.go:117] "RemoveContainer" containerID="1ae71949fca53228ee2b9b377af103076f6d32d2cbc715fcabe1ee9cb078d549" Oct 07 11:50:05 crc kubenswrapper[4700]: I1007 11:50:05.910443 4700 scope.go:117] "RemoveContainer" containerID="b11a9bfdec60767c7fd2db0e8957dfd00bdf83fb9b82fb97e44dc837a3de0005" Oct 07 11:50:05 crc kubenswrapper[4700]: I1007 11:50:05.982093 4700 scope.go:117] "RemoveContainer" 
containerID="c1af685aad61d376c1cfc82a063190289f77886b4ad1e6339fbed2b04f28c047" Oct 07 11:50:06 crc kubenswrapper[4700]: I1007 11:50:06.038134 4700 scope.go:117] "RemoveContainer" containerID="deadea4bbdd14a2a9e9ae2298fb29d71b595d3d35bbee933462eb6a99a59727a" Oct 07 11:50:06 crc kubenswrapper[4700]: I1007 11:50:06.074745 4700 scope.go:117] "RemoveContainer" containerID="51cec8adf1716593ae4097d82c2d01a7ee177da51ae7618ccd19de52e48a8c00" Oct 07 11:50:06 crc kubenswrapper[4700]: I1007 11:50:06.131565 4700 scope.go:117] "RemoveContainer" containerID="a348301cf929b80c8678ceb34ed69c857905dc7228c177378d33e2b0745d4f80" Oct 07 11:50:08 crc kubenswrapper[4700]: I1007 11:50:08.958488 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:50:08 crc kubenswrapper[4700]: E1007 11:50:08.960976 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:50:23 crc kubenswrapper[4700]: I1007 11:50:23.969179 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:50:23 crc kubenswrapper[4700]: E1007 11:50:23.970461 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:50:33 crc 
kubenswrapper[4700]: I1007 11:50:33.946079 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:33 crc kubenswrapper[4700]: I1007 11:50:33.949205 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.014089 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.082719 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.082796 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84846\" (UniqueName: \"kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.082878 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.184466 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.184615 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.184656 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84846\" (UniqueName: \"kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.185457 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.185771 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.206811 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84846\" (UniqueName: 
\"kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846\") pod \"community-operators-dtsjn\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.291951 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.885839 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:34 crc kubenswrapper[4700]: W1007 11:50:34.895487 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5609da4e_ee0b_456b_8dbd_8fdc60676606.slice/crio-40459dfbb0146aa383885657e681678105cc532040d1c0ed2b445f8d3b248bca WatchSource:0}: Error finding container 40459dfbb0146aa383885657e681678105cc532040d1c0ed2b445f8d3b248bca: Status 404 returned error can't find the container with id 40459dfbb0146aa383885657e681678105cc532040d1c0ed2b445f8d3b248bca Oct 07 11:50:34 crc kubenswrapper[4700]: I1007 11:50:34.957808 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:50:34 crc kubenswrapper[4700]: E1007 11:50:34.958061 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:50:35 crc kubenswrapper[4700]: I1007 11:50:35.906102 4700 generic.go:334] "Generic (PLEG): container finished" podID="5609da4e-ee0b-456b-8dbd-8fdc60676606" 
containerID="1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620" exitCode=0 Oct 07 11:50:35 crc kubenswrapper[4700]: I1007 11:50:35.906201 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerDied","Data":"1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620"} Oct 07 11:50:35 crc kubenswrapper[4700]: I1007 11:50:35.906598 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerStarted","Data":"40459dfbb0146aa383885657e681678105cc532040d1c0ed2b445f8d3b248bca"} Oct 07 11:50:36 crc kubenswrapper[4700]: I1007 11:50:36.918667 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerStarted","Data":"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2"} Oct 07 11:50:37 crc kubenswrapper[4700]: I1007 11:50:37.944339 4700 generic.go:334] "Generic (PLEG): container finished" podID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerID="f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2" exitCode=0 Oct 07 11:50:37 crc kubenswrapper[4700]: I1007 11:50:37.944409 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerDied","Data":"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2"} Oct 07 11:50:38 crc kubenswrapper[4700]: I1007 11:50:38.956885 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerStarted","Data":"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3"} Oct 07 11:50:38 crc 
kubenswrapper[4700]: I1007 11:50:38.984525 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dtsjn" podStartSLOduration=3.361903432 podStartE2EDuration="5.984491688s" podCreationTimestamp="2025-10-07 11:50:33 +0000 UTC" firstStartedPulling="2025-10-07 11:50:35.908666587 +0000 UTC m=+1802.705065616" lastFinishedPulling="2025-10-07 11:50:38.531254843 +0000 UTC m=+1805.327653872" observedRunningTime="2025-10-07 11:50:38.974832505 +0000 UTC m=+1805.771231514" watchObservedRunningTime="2025-10-07 11:50:38.984491688 +0000 UTC m=+1805.780890727" Oct 07 11:50:43 crc kubenswrapper[4700]: I1007 11:50:43.062865 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lkqh"] Oct 07 11:50:43 crc kubenswrapper[4700]: I1007 11:50:43.076954 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lkqh"] Oct 07 11:50:43 crc kubenswrapper[4700]: I1007 11:50:43.977524 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea0db12-6a5a-4452-bc47-31df9a1bb76e" path="/var/lib/kubelet/pods/fea0db12-6a5a-4452-bc47-31df9a1bb76e/volumes" Oct 07 11:50:44 crc kubenswrapper[4700]: I1007 11:50:44.011920 4700 generic.go:334] "Generic (PLEG): container finished" podID="abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" containerID="0a33e635b3236dcaf7eb07c29383a1f636397013e38e480fc2de7ac73676d9d3" exitCode=2 Oct 07 11:50:44 crc kubenswrapper[4700]: I1007 11:50:44.011993 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" event={"ID":"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff","Type":"ContainerDied","Data":"0a33e635b3236dcaf7eb07c29383a1f636397013e38e480fc2de7ac73676d9d3"} Oct 07 11:50:44 crc kubenswrapper[4700]: I1007 11:50:44.292909 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:44 crc 
kubenswrapper[4700]: I1007 11:50:44.292993 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:44 crc kubenswrapper[4700]: I1007 11:50:44.361370 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.114581 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.550281 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.625662 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mvw\" (UniqueName: \"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw\") pod \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.625755 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory\") pod \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.625893 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key\") pod \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\" (UID: \"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff\") " Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.630800 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw" (OuterVolumeSpecName: "kube-api-access-j6mvw") pod "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" (UID: "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff"). InnerVolumeSpecName "kube-api-access-j6mvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.652152 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" (UID: "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.679680 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory" (OuterVolumeSpecName: "inventory") pod "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" (UID: "abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.729170 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.729233 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mvw\" (UniqueName: \"kubernetes.io/projected/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-kube-api-access-j6mvw\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.729260 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.926578 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:45 crc kubenswrapper[4700]: I1007 11:50:45.957371 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:50:46 crc kubenswrapper[4700]: I1007 11:50:46.034471 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" event={"ID":"abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff","Type":"ContainerDied","Data":"f6324501931d678276d9dee2b8de6631dea634e2fdeec2f93a139983b1aed7be"} Oct 07 11:50:46 crc kubenswrapper[4700]: I1007 11:50:46.034528 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6324501931d678276d9dee2b8de6631dea634e2fdeec2f93a139983b1aed7be" Oct 07 11:50:46 crc kubenswrapper[4700]: I1007 11:50:46.034619 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.046869 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25"} Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.046970 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dtsjn" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="registry-server" containerID="cri-o://5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3" gracePeriod=2 Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.475957 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.569746 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities\") pod \"5609da4e-ee0b-456b-8dbd-8fdc60676606\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.569898 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84846\" (UniqueName: \"kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846\") pod \"5609da4e-ee0b-456b-8dbd-8fdc60676606\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.570061 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content\") pod 
\"5609da4e-ee0b-456b-8dbd-8fdc60676606\" (UID: \"5609da4e-ee0b-456b-8dbd-8fdc60676606\") " Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.571214 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities" (OuterVolumeSpecName: "utilities") pod "5609da4e-ee0b-456b-8dbd-8fdc60676606" (UID: "5609da4e-ee0b-456b-8dbd-8fdc60676606"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.576687 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846" (OuterVolumeSpecName: "kube-api-access-84846") pod "5609da4e-ee0b-456b-8dbd-8fdc60676606" (UID: "5609da4e-ee0b-456b-8dbd-8fdc60676606"). InnerVolumeSpecName "kube-api-access-84846". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.672786 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.672837 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84846\" (UniqueName: \"kubernetes.io/projected/5609da4e-ee0b-456b-8dbd-8fdc60676606-kube-api-access-84846\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.701112 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5609da4e-ee0b-456b-8dbd-8fdc60676606" (UID: "5609da4e-ee0b-456b-8dbd-8fdc60676606"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:50:47 crc kubenswrapper[4700]: I1007 11:50:47.776172 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5609da4e-ee0b-456b-8dbd-8fdc60676606-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.061772 4700 generic.go:334] "Generic (PLEG): container finished" podID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerID="5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3" exitCode=0 Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.061822 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerDied","Data":"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3"} Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.061859 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtsjn" event={"ID":"5609da4e-ee0b-456b-8dbd-8fdc60676606","Type":"ContainerDied","Data":"40459dfbb0146aa383885657e681678105cc532040d1c0ed2b445f8d3b248bca"} Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.061882 4700 scope.go:117] "RemoveContainer" containerID="5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.061886 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtsjn" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.100435 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.107150 4700 scope.go:117] "RemoveContainer" containerID="f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.112567 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dtsjn"] Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.138856 4700 scope.go:117] "RemoveContainer" containerID="1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.209232 4700 scope.go:117] "RemoveContainer" containerID="5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3" Oct 07 11:50:48 crc kubenswrapper[4700]: E1007 11:50:48.210622 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3\": container with ID starting with 5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3 not found: ID does not exist" containerID="5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.210667 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3"} err="failed to get container status \"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3\": rpc error: code = NotFound desc = could not find container \"5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3\": container with ID starting with 5ab09cf6c3e4fb178af7a4db4c3db26f562df467bf4a78949444c623e7ff7dd3 not 
found: ID does not exist" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.210693 4700 scope.go:117] "RemoveContainer" containerID="f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2" Oct 07 11:50:48 crc kubenswrapper[4700]: E1007 11:50:48.211147 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2\": container with ID starting with f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2 not found: ID does not exist" containerID="f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.211172 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2"} err="failed to get container status \"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2\": rpc error: code = NotFound desc = could not find container \"f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2\": container with ID starting with f268bd25f6cebf10f15ba3d5b7148dd4ac118d3404cd422f4cfaea7423f37ce2 not found: ID does not exist" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.211187 4700 scope.go:117] "RemoveContainer" containerID="1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620" Oct 07 11:50:48 crc kubenswrapper[4700]: E1007 11:50:48.211448 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620\": container with ID starting with 1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620 not found: ID does not exist" containerID="1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620" Oct 07 11:50:48 crc kubenswrapper[4700]: I1007 11:50:48.211478 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620"} err="failed to get container status \"1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620\": rpc error: code = NotFound desc = could not find container \"1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620\": container with ID starting with 1f0162e243d59832b09cfe9ccf1dffd3af355d378d9cf0639248a9adf570c620 not found: ID does not exist" Oct 07 11:50:49 crc kubenswrapper[4700]: I1007 11:50:49.976217 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" path="/var/lib/kubelet/pods/5609da4e-ee0b-456b-8dbd-8fdc60676606/volumes" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.034705 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p"] Oct 07 11:50:53 crc kubenswrapper[4700]: E1007 11:50:53.035572 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="extract-utilities" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.035585 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="extract-utilities" Oct 07 11:50:53 crc kubenswrapper[4700]: E1007 11:50:53.035596 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="registry-server" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.035602 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="registry-server" Oct 07 11:50:53 crc kubenswrapper[4700]: E1007 11:50:53.035625 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="extract-content" Oct 07 11:50:53 crc 
kubenswrapper[4700]: I1007 11:50:53.035632 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="extract-content" Oct 07 11:50:53 crc kubenswrapper[4700]: E1007 11:50:53.035639 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.035646 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.035934 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.035948 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5609da4e-ee0b-456b-8dbd-8fdc60676606" containerName="registry-server" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.036633 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.040416 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.041080 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.041131 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.043069 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.055771 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p"] Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.189536 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjj5c\" (UniqueName: \"kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.189618 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.189702 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.292273 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjj5c\" (UniqueName: \"kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.292393 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.292481 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.301416 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: 
\"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.301440 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.315344 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjj5c\" (UniqueName: \"kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:53 crc kubenswrapper[4700]: I1007 11:50:53.365646 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:50:54 crc kubenswrapper[4700]: I1007 11:50:54.002639 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p"] Oct 07 11:50:54 crc kubenswrapper[4700]: I1007 11:50:54.126734 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" event={"ID":"efce44a2-43b7-497a-bd61-81d0bbb5259b","Type":"ContainerStarted","Data":"fcd970ca94c6caf0a9ba6f774a443e6ba354cea4329bcfb38d65f46099f9b6c5"} Oct 07 11:50:55 crc kubenswrapper[4700]: I1007 11:50:55.135396 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" event={"ID":"efce44a2-43b7-497a-bd61-81d0bbb5259b","Type":"ContainerStarted","Data":"c60f8feddfd290d5f61daeaad726902ae8c43412110a7a698a881773176cced7"} Oct 07 11:50:55 crc kubenswrapper[4700]: I1007 11:50:55.157097 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" podStartSLOduration=1.7325196109999998 podStartE2EDuration="2.157073077s" podCreationTimestamp="2025-10-07 11:50:53 +0000 UTC" firstStartedPulling="2025-10-07 11:50:54.013874353 +0000 UTC m=+1820.810273352" lastFinishedPulling="2025-10-07 11:50:54.438427789 +0000 UTC m=+1821.234826818" observedRunningTime="2025-10-07 11:50:55.1533652 +0000 UTC m=+1821.949764209" watchObservedRunningTime="2025-10-07 11:50:55.157073077 +0000 UTC m=+1821.953472066" Oct 07 11:51:06 crc kubenswrapper[4700]: I1007 11:51:06.289432 4700 scope.go:117] "RemoveContainer" containerID="1464cd0c52b0ed5ed8665b4cfe7e385c1f777085b1d80c27077e90d3861e5c03" Oct 07 11:51:47 crc kubenswrapper[4700]: I1007 11:51:47.681873 4700 generic.go:334] "Generic (PLEG): container finished" podID="efce44a2-43b7-497a-bd61-81d0bbb5259b" 
containerID="c60f8feddfd290d5f61daeaad726902ae8c43412110a7a698a881773176cced7" exitCode=0 Oct 07 11:51:47 crc kubenswrapper[4700]: I1007 11:51:47.681950 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" event={"ID":"efce44a2-43b7-497a-bd61-81d0bbb5259b","Type":"ContainerDied","Data":"c60f8feddfd290d5f61daeaad726902ae8c43412110a7a698a881773176cced7"} Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.113027 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.294564 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key\") pod \"efce44a2-43b7-497a-bd61-81d0bbb5259b\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.294654 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjj5c\" (UniqueName: \"kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c\") pod \"efce44a2-43b7-497a-bd61-81d0bbb5259b\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.294815 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory\") pod \"efce44a2-43b7-497a-bd61-81d0bbb5259b\" (UID: \"efce44a2-43b7-497a-bd61-81d0bbb5259b\") " Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.299866 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c" (OuterVolumeSpecName: "kube-api-access-zjj5c") pod 
"efce44a2-43b7-497a-bd61-81d0bbb5259b" (UID: "efce44a2-43b7-497a-bd61-81d0bbb5259b"). InnerVolumeSpecName "kube-api-access-zjj5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.323805 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory" (OuterVolumeSpecName: "inventory") pod "efce44a2-43b7-497a-bd61-81d0bbb5259b" (UID: "efce44a2-43b7-497a-bd61-81d0bbb5259b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.344246 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "efce44a2-43b7-497a-bd61-81d0bbb5259b" (UID: "efce44a2-43b7-497a-bd61-81d0bbb5259b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.396690 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.396723 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efce44a2-43b7-497a-bd61-81d0bbb5259b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.396736 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjj5c\" (UniqueName: \"kubernetes.io/projected/efce44a2-43b7-497a-bd61-81d0bbb5259b-kube-api-access-zjj5c\") on node \"crc\" DevicePath \"\"" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.705895 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" 
event={"ID":"efce44a2-43b7-497a-bd61-81d0bbb5259b","Type":"ContainerDied","Data":"fcd970ca94c6caf0a9ba6f774a443e6ba354cea4329bcfb38d65f46099f9b6c5"} Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.705948 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcd970ca94c6caf0a9ba6f774a443e6ba354cea4329bcfb38d65f46099f9b6c5" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.706030 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.825792 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zdwmq"] Oct 07 11:51:49 crc kubenswrapper[4700]: E1007 11:51:49.826175 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efce44a2-43b7-497a-bd61-81d0bbb5259b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.826194 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="efce44a2-43b7-497a-bd61-81d0bbb5259b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.826470 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="efce44a2-43b7-497a-bd61-81d0bbb5259b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.827031 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zdwmq"] Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.827105 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.835965 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.836414 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.837298 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:51:49 crc kubenswrapper[4700]: I1007 11:51:49.837564 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.008990 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.009639 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldjcs\" (UniqueName: \"kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.010596 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" 
(UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.113206 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldjcs\" (UniqueName: \"kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.115033 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.115292 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.124810 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.128258 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.147993 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldjcs\" (UniqueName: \"kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs\") pod \"ssh-known-hosts-edpm-deployment-zdwmq\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.157648 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:51:50 crc kubenswrapper[4700]: I1007 11:51:50.709059 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zdwmq"] Oct 07 11:51:51 crc kubenswrapper[4700]: I1007 11:51:51.734985 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" event={"ID":"046d1005-6634-4a0a-b10c-f5e3faf34ba6","Type":"ContainerStarted","Data":"260ae30f9ce0a5ced135861f7a820b544f8927c8c2a5195dbc885a96801ab985"} Oct 07 11:51:51 crc kubenswrapper[4700]: I1007 11:51:51.735424 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" event={"ID":"046d1005-6634-4a0a-b10c-f5e3faf34ba6","Type":"ContainerStarted","Data":"bdbb67366aefc620c59d1149b45d639cc75f4b4b81ac3ffa76a24fe4e8807dcf"} Oct 07 11:51:51 crc kubenswrapper[4700]: I1007 11:51:51.761936 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" podStartSLOduration=2.3150546260000002 podStartE2EDuration="2.761909396s" podCreationTimestamp="2025-10-07 11:51:49 +0000 UTC" 
firstStartedPulling="2025-10-07 11:51:50.722492206 +0000 UTC m=+1877.518891195" lastFinishedPulling="2025-10-07 11:51:51.169346966 +0000 UTC m=+1877.965745965" observedRunningTime="2025-10-07 11:51:51.755719144 +0000 UTC m=+1878.552118153" watchObservedRunningTime="2025-10-07 11:51:51.761909396 +0000 UTC m=+1878.558308425" Oct 07 11:51:59 crc kubenswrapper[4700]: I1007 11:51:59.843270 4700 generic.go:334] "Generic (PLEG): container finished" podID="046d1005-6634-4a0a-b10c-f5e3faf34ba6" containerID="260ae30f9ce0a5ced135861f7a820b544f8927c8c2a5195dbc885a96801ab985" exitCode=0 Oct 07 11:51:59 crc kubenswrapper[4700]: I1007 11:51:59.843457 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" event={"ID":"046d1005-6634-4a0a-b10c-f5e3faf34ba6","Type":"ContainerDied","Data":"260ae30f9ce0a5ced135861f7a820b544f8927c8c2a5195dbc885a96801ab985"} Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.282383 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.457142 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam\") pod \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.457249 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0\") pod \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.457373 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldjcs\" (UniqueName: \"kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs\") pod \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\" (UID: \"046d1005-6634-4a0a-b10c-f5e3faf34ba6\") " Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.467295 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs" (OuterVolumeSpecName: "kube-api-access-ldjcs") pod "046d1005-6634-4a0a-b10c-f5e3faf34ba6" (UID: "046d1005-6634-4a0a-b10c-f5e3faf34ba6"). InnerVolumeSpecName "kube-api-access-ldjcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.484481 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "046d1005-6634-4a0a-b10c-f5e3faf34ba6" (UID: "046d1005-6634-4a0a-b10c-f5e3faf34ba6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.489535 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "046d1005-6634-4a0a-b10c-f5e3faf34ba6" (UID: "046d1005-6634-4a0a-b10c-f5e3faf34ba6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.559416 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.559453 4700 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/046d1005-6634-4a0a-b10c-f5e3faf34ba6-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.559463 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldjcs\" (UniqueName: \"kubernetes.io/projected/046d1005-6634-4a0a-b10c-f5e3faf34ba6-kube-api-access-ldjcs\") on node \"crc\" DevicePath \"\"" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.866770 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" 
event={"ID":"046d1005-6634-4a0a-b10c-f5e3faf34ba6","Type":"ContainerDied","Data":"bdbb67366aefc620c59d1149b45d639cc75f4b4b81ac3ffa76a24fe4e8807dcf"} Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.866813 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdbb67366aefc620c59d1149b45d639cc75f4b4b81ac3ffa76a24fe4e8807dcf" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.866903 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zdwmq" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.947095 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"] Oct 07 11:52:01 crc kubenswrapper[4700]: E1007 11:52:01.947578 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046d1005-6634-4a0a-b10c-f5e3faf34ba6" containerName="ssh-known-hosts-edpm-deployment" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.947603 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="046d1005-6634-4a0a-b10c-f5e3faf34ba6" containerName="ssh-known-hosts-edpm-deployment" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.947864 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="046d1005-6634-4a0a-b10c-f5e3faf34ba6" containerName="ssh-known-hosts-edpm-deployment" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.948680 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.950644 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.950728 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.950811 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.953705 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:52:01 crc kubenswrapper[4700]: I1007 11:52:01.975668 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"] Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.068639 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbb6\" (UniqueName: \"kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.068860 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.069239 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.171195 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbb6\" (UniqueName: \"kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.171243 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.171321 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.176374 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.179057 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.191371 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbb6\" (UniqueName: \"kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h694x\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.267980 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.833197 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"]
Oct 07 11:52:02 crc kubenswrapper[4700]: I1007 11:52:02.875361 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" event={"ID":"e0c8a166-6dac-4916-a2a7-9367a5ba765c","Type":"ContainerStarted","Data":"fb9d844eb370ccff98636e9047b09cb311a02f0fcb256cda3035aefc70a46487"}
Oct 07 11:52:03 crc kubenswrapper[4700]: I1007 11:52:03.885923 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" event={"ID":"e0c8a166-6dac-4916-a2a7-9367a5ba765c","Type":"ContainerStarted","Data":"c03e76ef912262baa7de43b59a215c019a68285cdc90f885117a6c23472f97df"}
Oct 07 11:52:03 crc kubenswrapper[4700]: I1007 11:52:03.909717 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" podStartSLOduration=2.265242412 podStartE2EDuration="2.909695539s" podCreationTimestamp="2025-10-07 11:52:01 +0000 UTC" firstStartedPulling="2025-10-07 11:52:02.849928628 +0000 UTC m=+1889.646327617" lastFinishedPulling="2025-10-07 11:52:03.494381725 +0000 UTC m=+1890.290780744" observedRunningTime="2025-10-07 11:52:03.901221887 +0000 UTC m=+1890.697620876" watchObservedRunningTime="2025-10-07 11:52:03.909695539 +0000 UTC m=+1890.706094528"
Oct 07 11:52:13 crc kubenswrapper[4700]: I1007 11:52:13.003017 4700 generic.go:334] "Generic (PLEG): container finished" podID="e0c8a166-6dac-4916-a2a7-9367a5ba765c" containerID="c03e76ef912262baa7de43b59a215c019a68285cdc90f885117a6c23472f97df" exitCode=0
Oct 07 11:52:13 crc kubenswrapper[4700]: I1007 11:52:13.003065 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" event={"ID":"e0c8a166-6dac-4916-a2a7-9367a5ba765c","Type":"ContainerDied","Data":"c03e76ef912262baa7de43b59a215c019a68285cdc90f885117a6c23472f97df"}
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.503844 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.558962 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key\") pod \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") "
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.559026 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory\") pod \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") "
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.559109 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbb6\" (UniqueName: \"kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6\") pod \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\" (UID: \"e0c8a166-6dac-4916-a2a7-9367a5ba765c\") "
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.567896 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6" (OuterVolumeSpecName: "kube-api-access-dpbb6") pod "e0c8a166-6dac-4916-a2a7-9367a5ba765c" (UID: "e0c8a166-6dac-4916-a2a7-9367a5ba765c"). InnerVolumeSpecName "kube-api-access-dpbb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.609951 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory" (OuterVolumeSpecName: "inventory") pod "e0c8a166-6dac-4916-a2a7-9367a5ba765c" (UID: "e0c8a166-6dac-4916-a2a7-9367a5ba765c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.615499 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0c8a166-6dac-4916-a2a7-9367a5ba765c" (UID: "e0c8a166-6dac-4916-a2a7-9367a5ba765c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.660766 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.660815 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbb6\" (UniqueName: \"kubernetes.io/projected/e0c8a166-6dac-4916-a2a7-9367a5ba765c-kube-api-access-dpbb6\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:14 crc kubenswrapper[4700]: I1007 11:52:14.660839 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0c8a166-6dac-4916-a2a7-9367a5ba765c-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.074592 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x" event={"ID":"e0c8a166-6dac-4916-a2a7-9367a5ba765c","Type":"ContainerDied","Data":"fb9d844eb370ccff98636e9047b09cb311a02f0fcb256cda3035aefc70a46487"}
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.074668 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb9d844eb370ccff98636e9047b09cb311a02f0fcb256cda3035aefc70a46487"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.074794 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h694x"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.120586 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"]
Oct 07 11:52:15 crc kubenswrapper[4700]: E1007 11:52:15.121165 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c8a166-6dac-4916-a2a7-9367a5ba765c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.121192 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c8a166-6dac-4916-a2a7-9367a5ba765c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.121591 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c8a166-6dac-4916-a2a7-9367a5ba765c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.123342 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.127006 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.128019 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.128428 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.128467 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.144850 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"]
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.174941 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.175077 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxsl\" (UniqueName: \"kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.175208 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.277117 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.277277 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxsl\" (UniqueName: \"kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.277379 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.284259 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.285163 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.296814 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxsl\" (UniqueName: \"kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:15 crc kubenswrapper[4700]: I1007 11:52:15.456803 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:16 crc kubenswrapper[4700]: I1007 11:52:16.031406 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"]
Oct 07 11:52:16 crc kubenswrapper[4700]: I1007 11:52:16.086250 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb" event={"ID":"5c0097f4-e71a-4bfe-8425-c87d93929a43","Type":"ContainerStarted","Data":"511eb442a6123eb67853a82b48f466aed29594757d9e3aa7fcdfce127e1e68ca"}
Oct 07 11:52:17 crc kubenswrapper[4700]: I1007 11:52:17.102955 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb" event={"ID":"5c0097f4-e71a-4bfe-8425-c87d93929a43","Type":"ContainerStarted","Data":"400d0419de57f9b4336d8fff5e776790a690498e0c87c2f9f87513b6b966e220"}
Oct 07 11:52:17 crc kubenswrapper[4700]: I1007 11:52:17.146563 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb" podStartSLOduration=1.661842972 podStartE2EDuration="2.146536731s" podCreationTimestamp="2025-10-07 11:52:15 +0000 UTC" firstStartedPulling="2025-10-07 11:52:16.053735645 +0000 UTC m=+1902.850134644" lastFinishedPulling="2025-10-07 11:52:16.538429404 +0000 UTC m=+1903.334828403" observedRunningTime="2025-10-07 11:52:17.127293498 +0000 UTC m=+1903.923692517" watchObservedRunningTime="2025-10-07 11:52:17.146536731 +0000 UTC m=+1903.942935760"
Oct 07 11:52:28 crc kubenswrapper[4700]: I1007 11:52:28.226040 4700 generic.go:334] "Generic (PLEG): container finished" podID="5c0097f4-e71a-4bfe-8425-c87d93929a43" containerID="400d0419de57f9b4336d8fff5e776790a690498e0c87c2f9f87513b6b966e220" exitCode=0
Oct 07 11:52:28 crc kubenswrapper[4700]: I1007 11:52:28.226156 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb" event={"ID":"5c0097f4-e71a-4bfe-8425-c87d93929a43","Type":"ContainerDied","Data":"400d0419de57f9b4336d8fff5e776790a690498e0c87c2f9f87513b6b966e220"}
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.715297 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.890794 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory\") pod \"5c0097f4-e71a-4bfe-8425-c87d93929a43\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") "
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.890848 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdxsl\" (UniqueName: \"kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl\") pod \"5c0097f4-e71a-4bfe-8425-c87d93929a43\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") "
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.890911 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key\") pod \"5c0097f4-e71a-4bfe-8425-c87d93929a43\" (UID: \"5c0097f4-e71a-4bfe-8425-c87d93929a43\") "
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.900988 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl" (OuterVolumeSpecName: "kube-api-access-rdxsl") pod "5c0097f4-e71a-4bfe-8425-c87d93929a43" (UID: "5c0097f4-e71a-4bfe-8425-c87d93929a43"). InnerVolumeSpecName "kube-api-access-rdxsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.945050 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory" (OuterVolumeSpecName: "inventory") pod "5c0097f4-e71a-4bfe-8425-c87d93929a43" (UID: "5c0097f4-e71a-4bfe-8425-c87d93929a43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.947519 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c0097f4-e71a-4bfe-8425-c87d93929a43" (UID: "5c0097f4-e71a-4bfe-8425-c87d93929a43"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.992992 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.993018 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdxsl\" (UniqueName: \"kubernetes.io/projected/5c0097f4-e71a-4bfe-8425-c87d93929a43-kube-api-access-rdxsl\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:29 crc kubenswrapper[4700]: I1007 11:52:29.993027 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c0097f4-e71a-4bfe-8425-c87d93929a43-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.257425 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb" event={"ID":"5c0097f4-e71a-4bfe-8425-c87d93929a43","Type":"ContainerDied","Data":"511eb442a6123eb67853a82b48f466aed29594757d9e3aa7fcdfce127e1e68ca"}
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.257501 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511eb442a6123eb67853a82b48f466aed29594757d9e3aa7fcdfce127e1e68ca"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.257722 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.468690 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"]
Oct 07 11:52:30 crc kubenswrapper[4700]: E1007 11:52:30.469540 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0097f4-e71a-4bfe-8425-c87d93929a43" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.469591 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0097f4-e71a-4bfe-8425-c87d93929a43" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.469996 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0097f4-e71a-4bfe-8425-c87d93929a43" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.471122 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.475747 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.476431 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.476681 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.476773 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.476979 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.478468 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.479245 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.479986 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"]
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.480158 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.606274 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.606713 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.606816 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.606847 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.606959 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607024 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607055 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607105 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607197 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607296 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607404 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607438 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42gqw\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607561 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.607641 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.709206 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710386 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710439 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710475 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710522 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42gqw\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710566 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710601 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710661 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710746 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710800 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.710842 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.711078 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.711115 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.711156 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.715928 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"
Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.718082 4700 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.719166 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.719583 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.719913 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.722452 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.722475 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.723178 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.723571 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.725069 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: 
\"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.725919 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.728672 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.729419 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.738557 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42gqw\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:30 crc kubenswrapper[4700]: I1007 11:52:30.797839 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:52:31 crc kubenswrapper[4700]: I1007 11:52:31.418599 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh"] Oct 07 11:52:31 crc kubenswrapper[4700]: W1007 11:52:31.426458 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cfb28ea_af7c_4518_8e0a_34a0e87fb5f5.slice/crio-cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f WatchSource:0}: Error finding container cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f: Status 404 returned error can't find the container with id cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f Oct 07 11:52:31 crc kubenswrapper[4700]: I1007 11:52:31.430178 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 11:52:32 crc kubenswrapper[4700]: I1007 11:52:32.283250 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" event={"ID":"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5","Type":"ContainerStarted","Data":"afeb3eb5a83ee53477a6845fc54491b830cf89a2671ca38f56cca40fa54af22c"} Oct 07 11:52:32 crc kubenswrapper[4700]: I1007 11:52:32.283556 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" event={"ID":"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5","Type":"ContainerStarted","Data":"cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f"} Oct 07 11:52:32 crc kubenswrapper[4700]: I1007 11:52:32.314840 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" podStartSLOduration=1.907054981 podStartE2EDuration="2.314815077s" podCreationTimestamp="2025-10-07 11:52:30 +0000 UTC" firstStartedPulling="2025-10-07 11:52:31.429697475 +0000 UTC m=+1918.226096474" lastFinishedPulling="2025-10-07 11:52:31.837457541 +0000 UTC m=+1918.633856570" observedRunningTime="2025-10-07 11:52:32.30687802 +0000 UTC m=+1919.103277049" watchObservedRunningTime="2025-10-07 11:52:32.314815077 +0000 UTC m=+1919.111214096" Oct 07 11:53:15 crc kubenswrapper[4700]: I1007 11:53:15.333839 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:53:15 crc kubenswrapper[4700]: I1007 11:53:15.335418 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:53:17 crc kubenswrapper[4700]: I1007 11:53:17.789705 4700 generic.go:334] "Generic (PLEG): container finished" podID="1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" containerID="afeb3eb5a83ee53477a6845fc54491b830cf89a2671ca38f56cca40fa54af22c" exitCode=0 Oct 07 11:53:17 crc kubenswrapper[4700]: I1007 11:53:17.789854 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" event={"ID":"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5","Type":"ContainerDied","Data":"afeb3eb5a83ee53477a6845fc54491b830cf89a2671ca38f56cca40fa54af22c"} Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.338245 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392098 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392156 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392193 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392227 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392251 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: 
\"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392294 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42gqw\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392333 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392383 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.392430 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.393403 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: 
I1007 11:53:19.393445 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.393482 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.393529 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.393556 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle\") pod \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\" (UID: \"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5\") " Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.401198 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.401647 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.401742 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405658 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405673 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405765 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405754 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405812 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.405907 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.407005 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.407342 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw" (OuterVolumeSpecName: "kube-api-access-42gqw") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "kube-api-access-42gqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.413182 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.428064 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.439457 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory" (OuterVolumeSpecName: "inventory") pod "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" (UID: "1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495751 4700 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495781 4700 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495795 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495806 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495815 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath 
\"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495832 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495841 4700 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495850 4700 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495858 4700 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495867 4700 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495875 4700 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495883 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-libvirt-default-certs-0\") on node 
\"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495892 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.495901 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42gqw\" (UniqueName: \"kubernetes.io/projected/1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5-kube-api-access-42gqw\") on node \"crc\" DevicePath \"\"" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.817623 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" event={"ID":"1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5","Type":"ContainerDied","Data":"cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f"} Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.817662 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd6b59ca524d97b216181d3a4539f2a464de0299effde4095a66299e7736147f" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.817707 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.946123 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms"] Oct 07 11:53:19 crc kubenswrapper[4700]: E1007 11:53:19.946592 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.946616 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.946893 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.947752 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.951145 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.951160 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.951448 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.952743 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.951083 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:53:19 crc kubenswrapper[4700]: I1007 11:53:19.976002 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms"] Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.007904 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.007998 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.008085 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.008132 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x554\" (UniqueName: \"kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.008235 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.109750 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.109813 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.109881 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.109913 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x554\" (UniqueName: \"kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.109982 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.111739 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc 
kubenswrapper[4700]: I1007 11:53:20.113723 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.115180 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.117110 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.133867 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x554\" (UniqueName: \"kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s25ms\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.292143 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:53:20 crc kubenswrapper[4700]: I1007 11:53:20.879675 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms"] Oct 07 11:53:21 crc kubenswrapper[4700]: I1007 11:53:21.845819 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" event={"ID":"19561523-a9ea-4632-9aa7-6be23fa3eee5","Type":"ContainerStarted","Data":"c951e29d912a33f10f214c8e5b1e0694fe011e9a59fed1c6878560169118a908"} Oct 07 11:53:21 crc kubenswrapper[4700]: I1007 11:53:21.846182 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" event={"ID":"19561523-a9ea-4632-9aa7-6be23fa3eee5","Type":"ContainerStarted","Data":"c1b50e0ed7f78b843c78e0b0c9ebd0905c6402f2ca5a9f348583ec1b192f09b3"} Oct 07 11:53:21 crc kubenswrapper[4700]: I1007 11:53:21.872596 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" podStartSLOduration=2.308043773 podStartE2EDuration="2.87257987s" podCreationTimestamp="2025-10-07 11:53:19 +0000 UTC" firstStartedPulling="2025-10-07 11:53:20.888760765 +0000 UTC m=+1967.685159754" lastFinishedPulling="2025-10-07 11:53:21.453296832 +0000 UTC m=+1968.249695851" observedRunningTime="2025-10-07 11:53:21.869747186 +0000 UTC m=+1968.666146215" watchObservedRunningTime="2025-10-07 11:53:21.87257987 +0000 UTC m=+1968.668978859" Oct 07 11:53:45 crc kubenswrapper[4700]: I1007 11:53:45.333557 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:53:45 crc kubenswrapper[4700]: I1007 11:53:45.334132 4700 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.334160 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.334822 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.334877 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.335905 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.335998 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" 
containerName="machine-config-daemon" containerID="cri-o://b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25" gracePeriod=600 Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.480383 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25" exitCode=0 Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.480458 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25"} Oct 07 11:54:15 crc kubenswrapper[4700]: I1007 11:54:15.480724 4700 scope.go:117] "RemoveContainer" containerID="cd159b1c4a4e5543066d303bc63fe54cd56f2bbe3439d5248b95da07598874b4" Oct 07 11:54:16 crc kubenswrapper[4700]: I1007 11:54:16.493148 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2"} Oct 07 11:54:36 crc kubenswrapper[4700]: I1007 11:54:36.736524 4700 generic.go:334] "Generic (PLEG): container finished" podID="19561523-a9ea-4632-9aa7-6be23fa3eee5" containerID="c951e29d912a33f10f214c8e5b1e0694fe011e9a59fed1c6878560169118a908" exitCode=0 Oct 07 11:54:36 crc kubenswrapper[4700]: I1007 11:54:36.736617 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" event={"ID":"19561523-a9ea-4632-9aa7-6be23fa3eee5","Type":"ContainerDied","Data":"c951e29d912a33f10f214c8e5b1e0694fe011e9a59fed1c6878560169118a908"} Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.179208 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.277478 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0\") pod \"19561523-a9ea-4632-9aa7-6be23fa3eee5\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.277545 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x554\" (UniqueName: \"kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554\") pod \"19561523-a9ea-4632-9aa7-6be23fa3eee5\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.277633 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory\") pod \"19561523-a9ea-4632-9aa7-6be23fa3eee5\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.277743 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key\") pod \"19561523-a9ea-4632-9aa7-6be23fa3eee5\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.277759 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle\") pod \"19561523-a9ea-4632-9aa7-6be23fa3eee5\" (UID: \"19561523-a9ea-4632-9aa7-6be23fa3eee5\") " Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.283012 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554" (OuterVolumeSpecName: "kube-api-access-6x554") pod "19561523-a9ea-4632-9aa7-6be23fa3eee5" (UID: "19561523-a9ea-4632-9aa7-6be23fa3eee5"). InnerVolumeSpecName "kube-api-access-6x554". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.289599 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "19561523-a9ea-4632-9aa7-6be23fa3eee5" (UID: "19561523-a9ea-4632-9aa7-6be23fa3eee5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.309513 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19561523-a9ea-4632-9aa7-6be23fa3eee5" (UID: "19561523-a9ea-4632-9aa7-6be23fa3eee5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.325689 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "19561523-a9ea-4632-9aa7-6be23fa3eee5" (UID: "19561523-a9ea-4632-9aa7-6be23fa3eee5"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.327672 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory" (OuterVolumeSpecName: "inventory") pod "19561523-a9ea-4632-9aa7-6be23fa3eee5" (UID: "19561523-a9ea-4632-9aa7-6be23fa3eee5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.380492 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.380538 4700 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.380561 4700 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/19561523-a9ea-4632-9aa7-6be23fa3eee5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.380589 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x554\" (UniqueName: \"kubernetes.io/projected/19561523-a9ea-4632-9aa7-6be23fa3eee5-kube-api-access-6x554\") on node \"crc\" DevicePath \"\"" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.380606 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19561523-a9ea-4632-9aa7-6be23fa3eee5-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.765427 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" event={"ID":"19561523-a9ea-4632-9aa7-6be23fa3eee5","Type":"ContainerDied","Data":"c1b50e0ed7f78b843c78e0b0c9ebd0905c6402f2ca5a9f348583ec1b192f09b3"} Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.765490 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1b50e0ed7f78b843c78e0b0c9ebd0905c6402f2ca5a9f348583ec1b192f09b3" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.765513 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s25ms" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.884932 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d"] Oct 07 11:54:38 crc kubenswrapper[4700]: E1007 11:54:38.885507 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19561523-a9ea-4632-9aa7-6be23fa3eee5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.885533 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="19561523-a9ea-4632-9aa7-6be23fa3eee5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.885803 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="19561523-a9ea-4632-9aa7-6be23fa3eee5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.888121 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.890913 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.894694 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.895366 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.896655 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.897767 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.898658 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d"] Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.899485 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996268 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996450 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzb8z\" (UniqueName: \"kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996606 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996659 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996862 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:38 crc kubenswrapper[4700]: I1007 11:54:38.996918 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.100261 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.100447 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzb8z\" (UniqueName: \"kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.100653 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.100723 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.101037 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.101113 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.106082 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.107083 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.107964 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.108785 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.108831 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.124759 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzb8z\" (UniqueName: \"kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.227179 4700 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:54:39 crc kubenswrapper[4700]: I1007 11:54:39.808945 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d"] Oct 07 11:54:40 crc kubenswrapper[4700]: I1007 11:54:40.793056 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" event={"ID":"e614fc07-932e-461a-9921-3471f4649838","Type":"ContainerStarted","Data":"32ad92cb5818c3f8c2b506e5358bf9c681f6aa9b1f41f1d2ae37c4a3a943537e"} Oct 07 11:54:41 crc kubenswrapper[4700]: I1007 11:54:41.808631 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" event={"ID":"e614fc07-932e-461a-9921-3471f4649838","Type":"ContainerStarted","Data":"67ab9784e3d2792655a9392431976612571b56a91eb505dda88dc1fbeb18f392"} Oct 07 11:54:41 crc kubenswrapper[4700]: I1007 11:54:41.845521 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" podStartSLOduration=2.851859942 podStartE2EDuration="3.845494903s" podCreationTimestamp="2025-10-07 11:54:38 +0000 UTC" firstStartedPulling="2025-10-07 11:54:39.806328293 +0000 UTC m=+2046.602727282" lastFinishedPulling="2025-10-07 11:54:40.799963224 +0000 UTC m=+2047.596362243" observedRunningTime="2025-10-07 11:54:41.833471069 +0000 UTC m=+2048.629870098" watchObservedRunningTime="2025-10-07 11:54:41.845494903 +0000 UTC m=+2048.641893932" Oct 07 11:55:17 crc kubenswrapper[4700]: I1007 11:55:17.869213 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:17 crc kubenswrapper[4700]: I1007 11:55:17.874750 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:17 crc kubenswrapper[4700]: I1007 11:55:17.883482 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.024767 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.025065 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hb7\" (UniqueName: \"kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.025152 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.126931 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.127016 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-62hb7\" (UniqueName: \"kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.127040 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.127563 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.127652 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.156210 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hb7\" (UniqueName: \"kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7\") pod \"certified-operators-k8j6d\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.208940 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:18 crc kubenswrapper[4700]: I1007 11:55:18.695988 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:19 crc kubenswrapper[4700]: I1007 11:55:19.260541 4700 generic.go:334] "Generic (PLEG): container finished" podID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerID="71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc" exitCode=0 Oct 07 11:55:19 crc kubenswrapper[4700]: I1007 11:55:19.260599 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerDied","Data":"71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc"} Oct 07 11:55:19 crc kubenswrapper[4700]: I1007 11:55:19.260630 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerStarted","Data":"c75dd2891b0f92c573cef1ceb461f014b49b7e72f876c134901c62271b82f7a3"} Oct 07 11:55:20 crc kubenswrapper[4700]: I1007 11:55:20.279606 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerStarted","Data":"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e"} Oct 07 11:55:21 crc kubenswrapper[4700]: I1007 11:55:21.292029 4700 generic.go:334] "Generic (PLEG): container finished" podID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerID="0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e" exitCode=0 Oct 07 11:55:21 crc kubenswrapper[4700]: I1007 11:55:21.292077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" 
event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerDied","Data":"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e"} Oct 07 11:55:22 crc kubenswrapper[4700]: I1007 11:55:22.306974 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerStarted","Data":"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078"} Oct 07 11:55:22 crc kubenswrapper[4700]: I1007 11:55:22.347026 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k8j6d" podStartSLOduration=2.735761716 podStartE2EDuration="5.347004379s" podCreationTimestamp="2025-10-07 11:55:17 +0000 UTC" firstStartedPulling="2025-10-07 11:55:19.263018575 +0000 UTC m=+2086.059417564" lastFinishedPulling="2025-10-07 11:55:21.874261238 +0000 UTC m=+2088.670660227" observedRunningTime="2025-10-07 11:55:22.343687612 +0000 UTC m=+2089.140086631" watchObservedRunningTime="2025-10-07 11:55:22.347004379 +0000 UTC m=+2089.143403378" Oct 07 11:55:28 crc kubenswrapper[4700]: I1007 11:55:28.209111 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:28 crc kubenswrapper[4700]: I1007 11:55:28.210199 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:28 crc kubenswrapper[4700]: I1007 11:55:28.278963 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:28 crc kubenswrapper[4700]: I1007 11:55:28.447544 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:28 crc kubenswrapper[4700]: I1007 11:55:28.524628 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:30 crc kubenswrapper[4700]: I1007 11:55:30.416639 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k8j6d" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="registry-server" containerID="cri-o://645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078" gracePeriod=2 Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.067520 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.137739 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62hb7\" (UniqueName: \"kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7\") pod \"5c3eb338-4420-4398-88ab-98f8f244ce97\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.138005 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content\") pod \"5c3eb338-4420-4398-88ab-98f8f244ce97\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.138054 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities\") pod \"5c3eb338-4420-4398-88ab-98f8f244ce97\" (UID: \"5c3eb338-4420-4398-88ab-98f8f244ce97\") " Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.143338 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7" (OuterVolumeSpecName: "kube-api-access-62hb7") pod 
"5c3eb338-4420-4398-88ab-98f8f244ce97" (UID: "5c3eb338-4420-4398-88ab-98f8f244ce97"). InnerVolumeSpecName "kube-api-access-62hb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.144338 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities" (OuterVolumeSpecName: "utilities") pod "5c3eb338-4420-4398-88ab-98f8f244ce97" (UID: "5c3eb338-4420-4398-88ab-98f8f244ce97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.186188 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c3eb338-4420-4398-88ab-98f8f244ce97" (UID: "5c3eb338-4420-4398-88ab-98f8f244ce97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.241853 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62hb7\" (UniqueName: \"kubernetes.io/projected/5c3eb338-4420-4398-88ab-98f8f244ce97-kube-api-access-62hb7\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.241913 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.241936 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3eb338-4420-4398-88ab-98f8f244ce97-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.430299 4700 generic.go:334] "Generic (PLEG): container finished" podID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerID="645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078" exitCode=0 Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.430374 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8j6d" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.430374 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerDied","Data":"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078"} Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.430456 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8j6d" event={"ID":"5c3eb338-4420-4398-88ab-98f8f244ce97","Type":"ContainerDied","Data":"c75dd2891b0f92c573cef1ceb461f014b49b7e72f876c134901c62271b82f7a3"} Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.430488 4700 scope.go:117] "RemoveContainer" containerID="645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.454363 4700 scope.go:117] "RemoveContainer" containerID="0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.474674 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.484795 4700 scope.go:117] "RemoveContainer" containerID="71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.487602 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k8j6d"] Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.524897 4700 scope.go:117] "RemoveContainer" containerID="645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078" Oct 07 11:55:31 crc kubenswrapper[4700]: E1007 11:55:31.525543 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078\": container with ID starting with 645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078 not found: ID does not exist" containerID="645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.525602 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078"} err="failed to get container status \"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078\": rpc error: code = NotFound desc = could not find container \"645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078\": container with ID starting with 645d56f23a97d0be39d06b6eab71fb01dacc59836363213b3cbe82cedf9e4078 not found: ID does not exist" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.525691 4700 scope.go:117] "RemoveContainer" containerID="0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e" Oct 07 11:55:31 crc kubenswrapper[4700]: E1007 11:55:31.526187 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e\": container with ID starting with 0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e not found: ID does not exist" containerID="0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.526216 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e"} err="failed to get container status \"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e\": rpc error: code = NotFound desc = could not find container \"0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e\": container with ID 
starting with 0331d289b01738b7c015cfb0daa0c7d1d86a7e0e50586fb5accdea9f532b031e not found: ID does not exist" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.526235 4700 scope.go:117] "RemoveContainer" containerID="71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc" Oct 07 11:55:31 crc kubenswrapper[4700]: E1007 11:55:31.526832 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc\": container with ID starting with 71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc not found: ID does not exist" containerID="71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.526894 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc"} err="failed to get container status \"71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc\": rpc error: code = NotFound desc = could not find container \"71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc\": container with ID starting with 71639cfe3454d9f2637592d210ad23e9fc72fd52b928344bf9808e5e0da635fc not found: ID does not exist" Oct 07 11:55:31 crc kubenswrapper[4700]: I1007 11:55:31.972350 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" path="/var/lib/kubelet/pods/5c3eb338-4420-4398-88ab-98f8f244ce97/volumes" Oct 07 11:55:39 crc kubenswrapper[4700]: I1007 11:55:39.543352 4700 generic.go:334] "Generic (PLEG): container finished" podID="e614fc07-932e-461a-9921-3471f4649838" containerID="67ab9784e3d2792655a9392431976612571b56a91eb505dda88dc1fbeb18f392" exitCode=0 Oct 07 11:55:39 crc kubenswrapper[4700]: I1007 11:55:39.543459 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" event={"ID":"e614fc07-932e-461a-9921-3471f4649838","Type":"ContainerDied","Data":"67ab9784e3d2792655a9392431976612571b56a91eb505dda88dc1fbeb18f392"} Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.096432 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.156016 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.156139 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.156204 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.156285 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc 
kubenswrapper[4700]: I1007 11:55:41.156382 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.156428 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzb8z\" (UniqueName: \"kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z\") pod \"e614fc07-932e-461a-9921-3471f4649838\" (UID: \"e614fc07-932e-461a-9921-3471f4649838\") " Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.165731 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z" (OuterVolumeSpecName: "kube-api-access-vzb8z") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). InnerVolumeSpecName "kube-api-access-vzb8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.167828 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.195634 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory" (OuterVolumeSpecName: "inventory") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.203835 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.211356 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.217002 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e614fc07-932e-461a-9921-3471f4649838" (UID: "e614fc07-932e-461a-9921-3471f4649838"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261247 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261495 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261588 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzb8z\" (UniqueName: \"kubernetes.io/projected/e614fc07-932e-461a-9921-3471f4649838-kube-api-access-vzb8z\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261638 4700 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261655 4700 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.261674 4700 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e614fc07-932e-461a-9921-3471f4649838-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.572216 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" 
event={"ID":"e614fc07-932e-461a-9921-3471f4649838","Type":"ContainerDied","Data":"32ad92cb5818c3f8c2b506e5358bf9c681f6aa9b1f41f1d2ae37c4a3a943537e"} Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.572291 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ad92cb5818c3f8c2b506e5358bf9c681f6aa9b1f41f1d2ae37c4a3a943537e" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.572345 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.717909 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n"] Oct 07 11:55:41 crc kubenswrapper[4700]: E1007 11:55:41.718388 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="extract-content" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718411 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="extract-content" Oct 07 11:55:41 crc kubenswrapper[4700]: E1007 11:55:41.718424 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="registry-server" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718433 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="registry-server" Oct 07 11:55:41 crc kubenswrapper[4700]: E1007 11:55:41.718456 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="extract-utilities" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718464 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="extract-utilities" Oct 07 11:55:41 crc 
kubenswrapper[4700]: E1007 11:55:41.718488 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e614fc07-932e-461a-9921-3471f4649838" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718497 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="e614fc07-932e-461a-9921-3471f4649838" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718746 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3eb338-4420-4398-88ab-98f8f244ce97" containerName="registry-server" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.718764 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="e614fc07-932e-461a-9921-3471f4649838" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.719501 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.722398 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.722536 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.722378 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.724248 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.725763 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.742225 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n"] Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.772975 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.773029 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.773176 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.773371 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.773414 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.875228 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.875570 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.875823 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.876092 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.876265 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.881682 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.882017 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.882443 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.882980 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:41 crc kubenswrapper[4700]: I1007 11:55:41.905578 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:42 crc kubenswrapper[4700]: I1007 11:55:42.042673 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 11:55:42 crc kubenswrapper[4700]: I1007 11:55:42.462712 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n"] Oct 07 11:55:42 crc kubenswrapper[4700]: I1007 11:55:42.585820 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" event={"ID":"075a58e4-36cd-4194-a235-b75f63adb1e2","Type":"ContainerStarted","Data":"e6deb03dff1561c1990d6d7561c96d86fb04003aaacfb372fe89ef12968681a4"} Oct 07 11:55:43 crc kubenswrapper[4700]: I1007 11:55:43.600665 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" event={"ID":"075a58e4-36cd-4194-a235-b75f63adb1e2","Type":"ContainerStarted","Data":"167bfe90c857057da426e3d082c802766c69d2c2d4ac1aef144adae1634c00e0"} Oct 07 11:55:43 crc kubenswrapper[4700]: I1007 11:55:43.638376 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" podStartSLOduration=2.062034968 podStartE2EDuration="2.63834456s" podCreationTimestamp="2025-10-07 11:55:41 +0000 UTC" firstStartedPulling="2025-10-07 11:55:42.476710291 +0000 UTC m=+2109.273109290" lastFinishedPulling="2025-10-07 11:55:43.053019863 +0000 UTC m=+2109.849418882" observedRunningTime="2025-10-07 11:55:43.623785649 +0000 UTC m=+2110.420184708" watchObservedRunningTime="2025-10-07 11:55:43.63834456 +0000 UTC m=+2110.434743599" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.214093 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.219576 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.239449 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.344921 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.345570 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxt6v\" (UniqueName: \"kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.345702 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.447691 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.447792 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kxt6v\" (UniqueName: \"kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.447845 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.448394 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.448690 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.475113 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxt6v\" (UniqueName: \"kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v\") pod \"redhat-operators-vnxfl\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:10 crc kubenswrapper[4700]: I1007 11:56:10.563039 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:11 crc kubenswrapper[4700]: I1007 11:56:11.046170 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:11 crc kubenswrapper[4700]: I1007 11:56:11.907279 4700 generic.go:334] "Generic (PLEG): container finished" podID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerID="d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4" exitCode=0 Oct 07 11:56:11 crc kubenswrapper[4700]: I1007 11:56:11.907555 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerDied","Data":"d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4"} Oct 07 11:56:11 crc kubenswrapper[4700]: I1007 11:56:11.907582 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerStarted","Data":"1f5623cb2c92722d800f2cfca25e42df88cfc33d242722a47bec499fbf7e5cf4"} Oct 07 11:56:12 crc kubenswrapper[4700]: I1007 11:56:12.920714 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerStarted","Data":"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee"} Oct 07 11:56:13 crc kubenswrapper[4700]: I1007 11:56:13.939282 4700 generic.go:334] "Generic (PLEG): container finished" podID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerID="ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee" exitCode=0 Oct 07 11:56:13 crc kubenswrapper[4700]: I1007 11:56:13.939365 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" 
event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerDied","Data":"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee"} Oct 07 11:56:14 crc kubenswrapper[4700]: I1007 11:56:14.953010 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerStarted","Data":"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5"} Oct 07 11:56:15 crc kubenswrapper[4700]: I1007 11:56:15.000438 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vnxfl" podStartSLOduration=2.491841088 podStartE2EDuration="5.000408593s" podCreationTimestamp="2025-10-07 11:56:10 +0000 UTC" firstStartedPulling="2025-10-07 11:56:11.90919452 +0000 UTC m=+2138.705593509" lastFinishedPulling="2025-10-07 11:56:14.417761995 +0000 UTC m=+2141.214161014" observedRunningTime="2025-10-07 11:56:14.980752738 +0000 UTC m=+2141.777151767" watchObservedRunningTime="2025-10-07 11:56:15.000408593 +0000 UTC m=+2141.796807642" Oct 07 11:56:15 crc kubenswrapper[4700]: I1007 11:56:15.334478 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:56:15 crc kubenswrapper[4700]: I1007 11:56:15.334550 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:56:20 crc kubenswrapper[4700]: I1007 11:56:20.563749 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:20 crc kubenswrapper[4700]: I1007 11:56:20.564353 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:20 crc kubenswrapper[4700]: I1007 11:56:20.625207 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:21 crc kubenswrapper[4700]: I1007 11:56:21.087128 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:21 crc kubenswrapper[4700]: I1007 11:56:21.152256 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.034456 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vnxfl" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="registry-server" containerID="cri-o://bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5" gracePeriod=2 Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.517926 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.627230 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content\") pod \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.627295 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities\") pod \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.627538 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxt6v\" (UniqueName: \"kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v\") pod \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\" (UID: \"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424\") " Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.628290 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities" (OuterVolumeSpecName: "utilities") pod "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" (UID: "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.629014 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.633101 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v" (OuterVolumeSpecName: "kube-api-access-kxt6v") pod "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" (UID: "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424"). InnerVolumeSpecName "kube-api-access-kxt6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 11:56:23 crc kubenswrapper[4700]: I1007 11:56:23.731732 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxt6v\" (UniqueName: \"kubernetes.io/projected/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-kube-api-access-kxt6v\") on node \"crc\" DevicePath \"\"" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.050100 4700 generic.go:334] "Generic (PLEG): container finished" podID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerID="bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5" exitCode=0 Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.050188 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnxfl" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.050238 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerDied","Data":"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5"} Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.050641 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnxfl" event={"ID":"9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424","Type":"ContainerDied","Data":"1f5623cb2c92722d800f2cfca25e42df88cfc33d242722a47bec499fbf7e5cf4"} Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.050677 4700 scope.go:117] "RemoveContainer" containerID="bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.164242 4700 scope.go:117] "RemoveContainer" containerID="ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.202176 4700 scope.go:117] "RemoveContainer" containerID="d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.260647 4700 scope.go:117] "RemoveContainer" containerID="bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5" Oct 07 11:56:24 crc kubenswrapper[4700]: E1007 11:56:24.261047 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5\": container with ID starting with bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5 not found: ID does not exist" containerID="bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.261177 4700 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5"} err="failed to get container status \"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5\": rpc error: code = NotFound desc = could not find container \"bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5\": container with ID starting with bde22037c377acff0d1a9bdfdcc56d4c767df5f1bf051a67a7c9d18ba1c0f0b5 not found: ID does not exist" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.261214 4700 scope.go:117] "RemoveContainer" containerID="ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee" Oct 07 11:56:24 crc kubenswrapper[4700]: E1007 11:56:24.261615 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee\": container with ID starting with ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee not found: ID does not exist" containerID="ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.261637 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee"} err="failed to get container status \"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee\": rpc error: code = NotFound desc = could not find container \"ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee\": container with ID starting with ba4f30955ccf4e6b7e633335b0880c2152c5eb5e35c39af5c864fe485b0913ee not found: ID does not exist" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.261652 4700 scope.go:117] "RemoveContainer" containerID="d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4" Oct 07 11:56:24 crc kubenswrapper[4700]: E1007 11:56:24.261905 4700 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4\": container with ID starting with d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4 not found: ID does not exist" containerID="d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.261926 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4"} err="failed to get container status \"d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4\": rpc error: code = NotFound desc = could not find container \"d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4\": container with ID starting with d76ce96aa573abc0e27519a7972b4942c2f435aa231d263f7d3bef1d492fa2e4 not found: ID does not exist" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.354454 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" (UID: "9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.444689 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.691653 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:24 crc kubenswrapper[4700]: I1007 11:56:24.699954 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vnxfl"] Oct 07 11:56:25 crc kubenswrapper[4700]: I1007 11:56:25.978003 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" path="/var/lib/kubelet/pods/9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424/volumes" Oct 07 11:56:45 crc kubenswrapper[4700]: I1007 11:56:45.333374 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 11:56:45 crc kubenswrapper[4700]: I1007 11:56:45.334074 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.333409 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.334259 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.334343 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.335421 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.335493 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" gracePeriod=600 Oct 07 11:57:15 crc kubenswrapper[4700]: E1007 11:57:15.458493 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 
11:57:15.654596 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" exitCode=0 Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.654661 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2"} Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.654708 4700 scope.go:117] "RemoveContainer" containerID="b4eb2c0b76ac0ef29070f1fc5ad19b527d10f64c8c75273ac1ebd81f60388f25" Oct 07 11:57:15 crc kubenswrapper[4700]: I1007 11:57:15.655630 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:57:15 crc kubenswrapper[4700]: E1007 11:57:15.656101 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:57:27 crc kubenswrapper[4700]: I1007 11:57:27.957756 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:57:27 crc kubenswrapper[4700]: E1007 11:57:27.959192 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:57:42 crc kubenswrapper[4700]: I1007 11:57:42.958919 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:57:42 crc kubenswrapper[4700]: E1007 11:57:42.959828 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:57:57 crc kubenswrapper[4700]: I1007 11:57:57.958628 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:57:57 crc kubenswrapper[4700]: E1007 11:57:57.959726 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:58:09 crc kubenswrapper[4700]: I1007 11:58:09.958027 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:58:09 crc kubenswrapper[4700]: E1007 11:58:09.959110 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:58:21 crc kubenswrapper[4700]: I1007 11:58:21.957758 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:58:21 crc kubenswrapper[4700]: E1007 11:58:21.958818 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:58:33 crc kubenswrapper[4700]: I1007 11:58:33.970995 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:58:33 crc kubenswrapper[4700]: E1007 11:58:33.978923 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:58:44 crc kubenswrapper[4700]: I1007 11:58:44.958622 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:58:44 crc kubenswrapper[4700]: E1007 11:58:44.959878 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:58:55 crc kubenswrapper[4700]: I1007 11:58:55.958756 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:58:55 crc kubenswrapper[4700]: E1007 11:58:55.960143 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:59:08 crc kubenswrapper[4700]: I1007 11:59:08.957704 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:59:08 crc kubenswrapper[4700]: E1007 11:59:08.958834 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:59:22 crc kubenswrapper[4700]: I1007 11:59:22.957547 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:59:22 crc kubenswrapper[4700]: E1007 11:59:22.958703 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:59:34 crc kubenswrapper[4700]: I1007 11:59:34.958437 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:59:34 crc kubenswrapper[4700]: E1007 11:59:34.959532 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:59:45 crc kubenswrapper[4700]: I1007 11:59:45.958434 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:59:45 crc kubenswrapper[4700]: E1007 11:59:45.959579 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 11:59:59 crc kubenswrapper[4700]: I1007 11:59:59.957322 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 11:59:59 crc kubenswrapper[4700]: E1007 11:59:59.957997 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.167118 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9"] Oct 07 12:00:00 crc kubenswrapper[4700]: E1007 12:00:00.167713 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="extract-utilities" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.167742 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="extract-utilities" Oct 07 12:00:00 crc kubenswrapper[4700]: E1007 12:00:00.167804 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="extract-content" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.167817 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="extract-content" Oct 07 12:00:00 crc kubenswrapper[4700]: E1007 12:00:00.167846 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="registry-server" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.167859 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="registry-server" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.168217 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="9144fe13-c1c2-4c49-8b0d-b1c0c5d7d424" containerName="registry-server" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.169238 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.171294 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.187221 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.188224 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9"] Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.352064 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.352264 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psc9g\" (UniqueName: \"kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.352943 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.454437 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.454501 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.454557 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psc9g\" (UniqueName: \"kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.455440 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.465536 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.472065 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psc9g\" (UniqueName: \"kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g\") pod \"collect-profiles-29330640-wcjw9\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.506765 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:00 crc kubenswrapper[4700]: I1007 12:00:00.947234 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9"] Oct 07 12:00:00 crc kubenswrapper[4700]: W1007 12:00:00.955724 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf63e0e_9098_45a4_90ed_82228b6ceec5.slice/crio-eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7 WatchSource:0}: Error finding container eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7: Status 404 returned error can't find the container with id eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7 Oct 07 12:00:01 crc kubenswrapper[4700]: I1007 12:00:01.624366 4700 generic.go:334] "Generic (PLEG): container finished" podID="7cf63e0e-9098-45a4-90ed-82228b6ceec5" containerID="6343de72b982464b6c9ab2cbac31892653779d0587e1b18ef965317e3b3483c9" exitCode=0 Oct 07 12:00:01 crc kubenswrapper[4700]: I1007 12:00:01.624657 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" event={"ID":"7cf63e0e-9098-45a4-90ed-82228b6ceec5","Type":"ContainerDied","Data":"6343de72b982464b6c9ab2cbac31892653779d0587e1b18ef965317e3b3483c9"} Oct 07 12:00:01 crc kubenswrapper[4700]: I1007 12:00:01.624687 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" event={"ID":"7cf63e0e-9098-45a4-90ed-82228b6ceec5","Type":"ContainerStarted","Data":"eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7"} Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.046811 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.231069 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume\") pod \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.231653 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume\") pod \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.231905 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psc9g\" (UniqueName: \"kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g\") pod \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\" (UID: \"7cf63e0e-9098-45a4-90ed-82228b6ceec5\") " Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.232651 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cf63e0e-9098-45a4-90ed-82228b6ceec5" (UID: "7cf63e0e-9098-45a4-90ed-82228b6ceec5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.242856 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cf63e0e-9098-45a4-90ed-82228b6ceec5" (UID: "7cf63e0e-9098-45a4-90ed-82228b6ceec5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.243759 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g" (OuterVolumeSpecName: "kube-api-access-psc9g") pod "7cf63e0e-9098-45a4-90ed-82228b6ceec5" (UID: "7cf63e0e-9098-45a4-90ed-82228b6ceec5"). InnerVolumeSpecName "kube-api-access-psc9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.335192 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psc9g\" (UniqueName: \"kubernetes.io/projected/7cf63e0e-9098-45a4-90ed-82228b6ceec5-kube-api-access-psc9g\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.335262 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf63e0e-9098-45a4-90ed-82228b6ceec5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.335291 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf63e0e-9098-45a4-90ed-82228b6ceec5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.653348 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" event={"ID":"7cf63e0e-9098-45a4-90ed-82228b6ceec5","Type":"ContainerDied","Data":"eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7"} Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.653428 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaa85e36095f22135d6a18b3dfa1b5f165e4372a7070b8ce2677947554ae94e7" Oct 07 12:00:03 crc kubenswrapper[4700]: I1007 12:00:03.653386 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330640-wcjw9" Oct 07 12:00:04 crc kubenswrapper[4700]: I1007 12:00:04.157571 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"] Oct 07 12:00:04 crc kubenswrapper[4700]: I1007 12:00:04.168514 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330595-k68rm"] Oct 07 12:00:05 crc kubenswrapper[4700]: I1007 12:00:05.971021 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1405be-c33d-417f-839f-cb2f16ee0b70" path="/var/lib/kubelet/pods/2a1405be-c33d-417f-839f-cb2f16ee0b70/volumes" Oct 07 12:00:06 crc kubenswrapper[4700]: I1007 12:00:06.631514 4700 scope.go:117] "RemoveContainer" containerID="b0b165c7d26d1d296c43965c0f804e9e8afa33837599722f01c5286d61e1fa05" Oct 07 12:00:13 crc kubenswrapper[4700]: I1007 12:00:13.972155 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:00:13 crc kubenswrapper[4700]: E1007 12:00:13.973441 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:00:24 crc kubenswrapper[4700]: I1007 12:00:24.958022 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:00:24 crc kubenswrapper[4700]: E1007 12:00:24.958898 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:00:37 crc kubenswrapper[4700]: I1007 12:00:37.958210 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:00:37 crc kubenswrapper[4700]: E1007 12:00:37.959209 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:00:43 crc kubenswrapper[4700]: I1007 12:00:43.089215 4700 generic.go:334] "Generic (PLEG): container finished" podID="075a58e4-36cd-4194-a235-b75f63adb1e2" containerID="167bfe90c857057da426e3d082c802766c69d2c2d4ac1aef144adae1634c00e0" exitCode=0 Oct 07 12:00:43 crc kubenswrapper[4700]: I1007 12:00:43.089380 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" event={"ID":"075a58e4-36cd-4194-a235-b75f63adb1e2","Type":"ContainerDied","Data":"167bfe90c857057da426e3d082c802766c69d2c2d4ac1aef144adae1634c00e0"} Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.589554 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.700206 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f\") pod \"075a58e4-36cd-4194-a235-b75f63adb1e2\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.700270 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle\") pod \"075a58e4-36cd-4194-a235-b75f63adb1e2\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.700407 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory\") pod \"075a58e4-36cd-4194-a235-b75f63adb1e2\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.700504 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0\") pod \"075a58e4-36cd-4194-a235-b75f63adb1e2\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.700530 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key\") pod \"075a58e4-36cd-4194-a235-b75f63adb1e2\" (UID: \"075a58e4-36cd-4194-a235-b75f63adb1e2\") " Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.711575 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "075a58e4-36cd-4194-a235-b75f63adb1e2" (UID: "075a58e4-36cd-4194-a235-b75f63adb1e2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.713783 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f" (OuterVolumeSpecName: "kube-api-access-trq2f") pod "075a58e4-36cd-4194-a235-b75f63adb1e2" (UID: "075a58e4-36cd-4194-a235-b75f63adb1e2"). InnerVolumeSpecName "kube-api-access-trq2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.736732 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "075a58e4-36cd-4194-a235-b75f63adb1e2" (UID: "075a58e4-36cd-4194-a235-b75f63adb1e2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.737297 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "075a58e4-36cd-4194-a235-b75f63adb1e2" (UID: "075a58e4-36cd-4194-a235-b75f63adb1e2"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.746973 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory" (OuterVolumeSpecName: "inventory") pod "075a58e4-36cd-4194-a235-b75f63adb1e2" (UID: "075a58e4-36cd-4194-a235-b75f63adb1e2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.802358 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/075a58e4-36cd-4194-a235-b75f63adb1e2-kube-api-access-trq2f\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.802392 4700 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.802402 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.802414 4700 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:44 crc kubenswrapper[4700]: I1007 12:00:44.802422 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/075a58e4-36cd-4194-a235-b75f63adb1e2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.113676 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" event={"ID":"075a58e4-36cd-4194-a235-b75f63adb1e2","Type":"ContainerDied","Data":"e6deb03dff1561c1990d6d7561c96d86fb04003aaacfb372fe89ef12968681a4"} Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.113929 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6deb03dff1561c1990d6d7561c96d86fb04003aaacfb372fe89ef12968681a4" Oct 07 12:00:45 
crc kubenswrapper[4700]: I1007 12:00:45.113807 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.241508 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf"] Oct 07 12:00:45 crc kubenswrapper[4700]: E1007 12:00:45.242074 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075a58e4-36cd-4194-a235-b75f63adb1e2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.242105 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="075a58e4-36cd-4194-a235-b75f63adb1e2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 12:00:45 crc kubenswrapper[4700]: E1007 12:00:45.242142 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf63e0e-9098-45a4-90ed-82228b6ceec5" containerName="collect-profiles" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.242155 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf63e0e-9098-45a4-90ed-82228b6ceec5" containerName="collect-profiles" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.242539 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="075a58e4-36cd-4194-a235-b75f63adb1e2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.242586 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf63e0e-9098-45a4-90ed-82228b6ceec5" containerName="collect-profiles" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.243716 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.246142 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.246721 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.246925 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.246750 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.246797 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.247106 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.248095 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.255819 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf"] Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.312946 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 
12:00:45.313010 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313054 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313082 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313142 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313171 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313208 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r5r8\" (UniqueName: \"kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313224 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.313243 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.415553 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: 
\"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.416402 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.416605 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.416767 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.416920 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.417053 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.417215 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r5r8\" (UniqueName: \"kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.417353 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.417454 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.417455 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.420726 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.421029 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.421689 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.425241 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.425576 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.427646 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.429777 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.436976 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r5r8\" (UniqueName: \"kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wlkdf\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:45 crc kubenswrapper[4700]: I1007 12:00:45.576023 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:00:46 crc kubenswrapper[4700]: I1007 12:00:46.169078 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf"] Oct 07 12:00:46 crc kubenswrapper[4700]: I1007 12:00:46.184206 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:00:47 crc kubenswrapper[4700]: I1007 12:00:47.140176 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" event={"ID":"f28c07c7-b33b-4203-a814-25cc5156660b","Type":"ContainerStarted","Data":"f4694769b03171dfd998ce75e5e1ba3c302a95ec34edd2d9681265a0f5a471e8"} Oct 07 12:00:48 crc kubenswrapper[4700]: I1007 12:00:48.151326 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" event={"ID":"f28c07c7-b33b-4203-a814-25cc5156660b","Type":"ContainerStarted","Data":"f41f51ba566fdb2fb72859be08695186e391d3f90ed606d98a6daeec61351edb"} Oct 07 12:00:48 crc kubenswrapper[4700]: I1007 12:00:48.177954 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" podStartSLOduration=2.464551606 podStartE2EDuration="3.177935044s" podCreationTimestamp="2025-10-07 12:00:45 +0000 UTC" firstStartedPulling="2025-10-07 12:00:46.183773564 +0000 UTC m=+2412.980172593" lastFinishedPulling="2025-10-07 12:00:46.897157042 +0000 UTC m=+2413.693556031" observedRunningTime="2025-10-07 12:00:48.169738259 +0000 UTC m=+2414.966137298" watchObservedRunningTime="2025-10-07 12:00:48.177935044 +0000 UTC m=+2414.974334033" Oct 07 12:00:50 crc kubenswrapper[4700]: I1007 12:00:50.959509 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:00:50 crc kubenswrapper[4700]: E1007 12:00:50.960017 4700 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.156147 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330641-xmvkv"] Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.158711 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.176605 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330641-xmvkv"] Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.229851 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgx7\" (UniqueName: \"kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.230666 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.230793 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.230988 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.333496 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgx7\" (UniqueName: \"kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.333733 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.333804 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.333865 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.340896 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.340950 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.349951 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.352919 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgx7\" (UniqueName: \"kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7\") pod \"keystone-cron-29330641-xmvkv\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:00 crc kubenswrapper[4700]: I1007 12:01:00.501969 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:01 crc kubenswrapper[4700]: I1007 12:01:01.077256 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330641-xmvkv"] Oct 07 12:01:01 crc kubenswrapper[4700]: I1007 12:01:01.290763 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330641-xmvkv" event={"ID":"41b7be8c-afe7-4893-a50a-2e73d28bb1a9","Type":"ContainerStarted","Data":"bfbc775a15b0b49544548e8abd3443e5052a62be8311f0c745346964231a6a89"} Oct 07 12:01:01 crc kubenswrapper[4700]: I1007 12:01:01.291219 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330641-xmvkv" event={"ID":"41b7be8c-afe7-4893-a50a-2e73d28bb1a9","Type":"ContainerStarted","Data":"a88eceb917e807abcc901924c1dda8ebb72627e5e117e413e34f6a282d67a70f"} Oct 07 12:01:01 crc kubenswrapper[4700]: I1007 12:01:01.309916 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330641-xmvkv" podStartSLOduration=1.3098971609999999 podStartE2EDuration="1.309897161s" podCreationTimestamp="2025-10-07 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:01:01.307992021 +0000 UTC m=+2428.104391050" watchObservedRunningTime="2025-10-07 12:01:01.309897161 +0000 UTC m=+2428.106296190" Oct 07 12:01:02 crc kubenswrapper[4700]: I1007 12:01:02.957177 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:01:02 crc kubenswrapper[4700]: E1007 12:01:02.958752 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:01:03 crc kubenswrapper[4700]: I1007 12:01:03.311342 4700 generic.go:334] "Generic (PLEG): container finished" podID="41b7be8c-afe7-4893-a50a-2e73d28bb1a9" containerID="bfbc775a15b0b49544548e8abd3443e5052a62be8311f0c745346964231a6a89" exitCode=0 Oct 07 12:01:03 crc kubenswrapper[4700]: I1007 12:01:03.311443 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330641-xmvkv" event={"ID":"41b7be8c-afe7-4893-a50a-2e73d28bb1a9","Type":"ContainerDied","Data":"bfbc775a15b0b49544548e8abd3443e5052a62be8311f0c745346964231a6a89"} Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.715065 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.848090 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle\") pod \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.848428 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data\") pod \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.848750 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys\") pod \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.848839 4700 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdgx7\" (UniqueName: \"kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7\") pod \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\" (UID: \"41b7be8c-afe7-4893-a50a-2e73d28bb1a9\") " Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.854367 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "41b7be8c-afe7-4893-a50a-2e73d28bb1a9" (UID: "41b7be8c-afe7-4893-a50a-2e73d28bb1a9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.856067 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7" (OuterVolumeSpecName: "kube-api-access-zdgx7") pod "41b7be8c-afe7-4893-a50a-2e73d28bb1a9" (UID: "41b7be8c-afe7-4893-a50a-2e73d28bb1a9"). InnerVolumeSpecName "kube-api-access-zdgx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.902510 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41b7be8c-afe7-4893-a50a-2e73d28bb1a9" (UID: "41b7be8c-afe7-4893-a50a-2e73d28bb1a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.941660 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data" (OuterVolumeSpecName: "config-data") pod "41b7be8c-afe7-4893-a50a-2e73d28bb1a9" (UID: "41b7be8c-afe7-4893-a50a-2e73d28bb1a9"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.952561 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.952642 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.952662 4700 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:04 crc kubenswrapper[4700]: I1007 12:01:04.952709 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdgx7\" (UniqueName: \"kubernetes.io/projected/41b7be8c-afe7-4893-a50a-2e73d28bb1a9-kube-api-access-zdgx7\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:05 crc kubenswrapper[4700]: I1007 12:01:05.337276 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330641-xmvkv" event={"ID":"41b7be8c-afe7-4893-a50a-2e73d28bb1a9","Type":"ContainerDied","Data":"a88eceb917e807abcc901924c1dda8ebb72627e5e117e413e34f6a282d67a70f"} Oct 07 12:01:05 crc kubenswrapper[4700]: I1007 12:01:05.337691 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88eceb917e807abcc901924c1dda8ebb72627e5e117e413e34f6a282d67a70f" Oct 07 12:01:05 crc kubenswrapper[4700]: I1007 12:01:05.337370 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330641-xmvkv" Oct 07 12:01:13 crc kubenswrapper[4700]: I1007 12:01:13.971799 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:01:13 crc kubenswrapper[4700]: E1007 12:01:13.972818 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.499077 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:20 crc kubenswrapper[4700]: E1007 12:01:20.501226 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b7be8c-afe7-4893-a50a-2e73d28bb1a9" containerName="keystone-cron" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.501260 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b7be8c-afe7-4893-a50a-2e73d28bb1a9" containerName="keystone-cron" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.501617 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b7be8c-afe7-4893-a50a-2e73d28bb1a9" containerName="keystone-cron" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.507319 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.538562 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.612113 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.612200 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.612369 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqtm\" (UniqueName: \"kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.714528 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.714588 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.714631 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqtm\" (UniqueName: \"kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.715181 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.715375 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.747160 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqtm\" (UniqueName: \"kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm\") pod \"community-operators-5t565\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:20 crc kubenswrapper[4700]: I1007 12:01:20.848829 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:21 crc kubenswrapper[4700]: I1007 12:01:21.368441 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:21 crc kubenswrapper[4700]: I1007 12:01:21.507893 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerStarted","Data":"095166d8f81d3709d2d193e3a9389abb8ef399e259b5b768f7dcd682ae73495e"} Oct 07 12:01:22 crc kubenswrapper[4700]: I1007 12:01:22.522720 4700 generic.go:334] "Generic (PLEG): container finished" podID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerID="1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3" exitCode=0 Oct 07 12:01:22 crc kubenswrapper[4700]: I1007 12:01:22.522913 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerDied","Data":"1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3"} Oct 07 12:01:24 crc kubenswrapper[4700]: I1007 12:01:24.551880 4700 generic.go:334] "Generic (PLEG): container finished" podID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerID="34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03" exitCode=0 Oct 07 12:01:24 crc kubenswrapper[4700]: I1007 12:01:24.552018 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerDied","Data":"34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03"} Oct 07 12:01:24 crc kubenswrapper[4700]: I1007 12:01:24.957384 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:01:24 crc kubenswrapper[4700]: E1007 12:01:24.958504 4700 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:01:25 crc kubenswrapper[4700]: I1007 12:01:25.566745 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerStarted","Data":"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503"} Oct 07 12:01:25 crc kubenswrapper[4700]: I1007 12:01:25.601283 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t565" podStartSLOduration=3.057985237 podStartE2EDuration="5.60116853s" podCreationTimestamp="2025-10-07 12:01:20 +0000 UTC" firstStartedPulling="2025-10-07 12:01:22.525357225 +0000 UTC m=+2449.321756254" lastFinishedPulling="2025-10-07 12:01:25.068540518 +0000 UTC m=+2451.864939547" observedRunningTime="2025-10-07 12:01:25.592860012 +0000 UTC m=+2452.389259061" watchObservedRunningTime="2025-10-07 12:01:25.60116853 +0000 UTC m=+2452.397567519" Oct 07 12:01:30 crc kubenswrapper[4700]: I1007 12:01:30.849982 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:30 crc kubenswrapper[4700]: I1007 12:01:30.850873 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:30 crc kubenswrapper[4700]: I1007 12:01:30.935702 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:31 crc kubenswrapper[4700]: I1007 12:01:31.711072 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:31 crc kubenswrapper[4700]: I1007 12:01:31.782989 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:33 crc kubenswrapper[4700]: I1007 12:01:33.661432 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t565" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="registry-server" containerID="cri-o://ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503" gracePeriod=2 Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.203343 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.346069 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content\") pod \"27a60977-b987-4ca1-8bfc-48a43d7592ac\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.346140 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities\") pod \"27a60977-b987-4ca1-8bfc-48a43d7592ac\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.346346 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqtm\" (UniqueName: \"kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm\") pod \"27a60977-b987-4ca1-8bfc-48a43d7592ac\" (UID: \"27a60977-b987-4ca1-8bfc-48a43d7592ac\") " Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.347904 
4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities" (OuterVolumeSpecName: "utilities") pod "27a60977-b987-4ca1-8bfc-48a43d7592ac" (UID: "27a60977-b987-4ca1-8bfc-48a43d7592ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.353771 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm" (OuterVolumeSpecName: "kube-api-access-8wqtm") pod "27a60977-b987-4ca1-8bfc-48a43d7592ac" (UID: "27a60977-b987-4ca1-8bfc-48a43d7592ac"). InnerVolumeSpecName "kube-api-access-8wqtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.449494 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqtm\" (UniqueName: \"kubernetes.io/projected/27a60977-b987-4ca1-8bfc-48a43d7592ac-kube-api-access-8wqtm\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.449825 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.672608 4700 generic.go:334] "Generic (PLEG): container finished" podID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerID="ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503" exitCode=0 Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.672705 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t565" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.673401 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerDied","Data":"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503"} Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.673475 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t565" event={"ID":"27a60977-b987-4ca1-8bfc-48a43d7592ac","Type":"ContainerDied","Data":"095166d8f81d3709d2d193e3a9389abb8ef399e259b5b768f7dcd682ae73495e"} Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.673510 4700 scope.go:117] "RemoveContainer" containerID="ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.709198 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27a60977-b987-4ca1-8bfc-48a43d7592ac" (UID: "27a60977-b987-4ca1-8bfc-48a43d7592ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.709442 4700 scope.go:117] "RemoveContainer" containerID="34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.735120 4700 scope.go:117] "RemoveContainer" containerID="1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.758369 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27a60977-b987-4ca1-8bfc-48a43d7592ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.791190 4700 scope.go:117] "RemoveContainer" containerID="ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503" Oct 07 12:01:34 crc kubenswrapper[4700]: E1007 12:01:34.791851 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503\": container with ID starting with ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503 not found: ID does not exist" containerID="ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.791896 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503"} err="failed to get container status \"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503\": rpc error: code = NotFound desc = could not find container \"ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503\": container with ID starting with ba1d2796258077b14c4e3b620fea286a5ae3309a3819fb18d89c12025e159503 not found: ID does not exist" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.791921 4700 
scope.go:117] "RemoveContainer" containerID="34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03" Oct 07 12:01:34 crc kubenswrapper[4700]: E1007 12:01:34.792343 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03\": container with ID starting with 34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03 not found: ID does not exist" containerID="34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.792387 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03"} err="failed to get container status \"34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03\": rpc error: code = NotFound desc = could not find container \"34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03\": container with ID starting with 34f28ce58059c1aadf8ccb8dce5fa19864585d994d874f3ca75b1dba112fea03 not found: ID does not exist" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.792407 4700 scope.go:117] "RemoveContainer" containerID="1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3" Oct 07 12:01:34 crc kubenswrapper[4700]: E1007 12:01:34.793013 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3\": container with ID starting with 1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3 not found: ID does not exist" containerID="1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3" Oct 07 12:01:34 crc kubenswrapper[4700]: I1007 12:01:34.793054 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3"} err="failed to get container status \"1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3\": rpc error: code = NotFound desc = could not find container \"1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3\": container with ID starting with 1d145ec21afa369a61d833b1346ac4c04be5a3ca9f6afc5b8e043228fdc5ecf3 not found: ID does not exist" Oct 07 12:01:35 crc kubenswrapper[4700]: I1007 12:01:35.027247 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:35 crc kubenswrapper[4700]: I1007 12:01:35.042683 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5t565"] Oct 07 12:01:35 crc kubenswrapper[4700]: I1007 12:01:35.979372 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" path="/var/lib/kubelet/pods/27a60977-b987-4ca1-8bfc-48a43d7592ac/volumes" Oct 07 12:01:37 crc kubenswrapper[4700]: I1007 12:01:37.958512 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:01:37 crc kubenswrapper[4700]: E1007 12:01:37.959494 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:01:48 crc kubenswrapper[4700]: I1007 12:01:48.959356 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:01:48 crc kubenswrapper[4700]: E1007 12:01:48.960725 4700 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:02:00 crc kubenswrapper[4700]: I1007 12:02:00.958772 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:02:00 crc kubenswrapper[4700]: E1007 12:02:00.959897 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:02:11 crc kubenswrapper[4700]: I1007 12:02:11.958139 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:02:11 crc kubenswrapper[4700]: E1007 12:02:11.959584 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:02:26 crc kubenswrapper[4700]: I1007 12:02:26.957905 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:02:27 crc kubenswrapper[4700]: I1007 12:02:27.322863 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660"} Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.933880 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:07 crc kubenswrapper[4700]: E1007 12:04:07.940888 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="registry-server" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.940919 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="registry-server" Oct 07 12:04:07 crc kubenswrapper[4700]: E1007 12:04:07.940962 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="extract-utilities" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.940977 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="extract-utilities" Oct 07 12:04:07 crc kubenswrapper[4700]: E1007 12:04:07.941006 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="extract-content" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.941020 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="extract-content" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.941368 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a60977-b987-4ca1-8bfc-48a43d7592ac" containerName="registry-server" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.943739 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:07 crc kubenswrapper[4700]: I1007 12:04:07.948015 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.057759 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbm2p\" (UniqueName: \"kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.057862 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.057939 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.160268 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.160672 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dbm2p\" (UniqueName: \"kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.160813 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.161966 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.162608 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.197951 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbm2p\" (UniqueName: \"kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p\") pod \"redhat-marketplace-f8hgg\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.318458 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:08 crc kubenswrapper[4700]: I1007 12:04:08.837194 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:09 crc kubenswrapper[4700]: I1007 12:04:09.458699 4700 generic.go:334] "Generic (PLEG): container finished" podID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerID="30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db" exitCode=0 Oct 07 12:04:09 crc kubenswrapper[4700]: I1007 12:04:09.458792 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerDied","Data":"30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db"} Oct 07 12:04:09 crc kubenswrapper[4700]: I1007 12:04:09.460867 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerStarted","Data":"07c25d180c090ffda1e91e645ed27dae02719a876599b71040ad48f710909490"} Oct 07 12:04:11 crc kubenswrapper[4700]: I1007 12:04:11.490856 4700 generic.go:334] "Generic (PLEG): container finished" podID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerID="2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29" exitCode=0 Oct 07 12:04:11 crc kubenswrapper[4700]: I1007 12:04:11.491437 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerDied","Data":"2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29"} Oct 07 12:04:14 crc kubenswrapper[4700]: I1007 12:04:14.526621 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" 
event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerStarted","Data":"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698"} Oct 07 12:04:14 crc kubenswrapper[4700]: I1007 12:04:14.546888 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8hgg" podStartSLOduration=3.618293856 podStartE2EDuration="7.546866515s" podCreationTimestamp="2025-10-07 12:04:07 +0000 UTC" firstStartedPulling="2025-10-07 12:04:09.460471179 +0000 UTC m=+2616.256870188" lastFinishedPulling="2025-10-07 12:04:13.389043838 +0000 UTC m=+2620.185442847" observedRunningTime="2025-10-07 12:04:14.542011468 +0000 UTC m=+2621.338410477" watchObservedRunningTime="2025-10-07 12:04:14.546866515 +0000 UTC m=+2621.343265524" Oct 07 12:04:18 crc kubenswrapper[4700]: I1007 12:04:18.318781 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:18 crc kubenswrapper[4700]: I1007 12:04:18.319131 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:18 crc kubenswrapper[4700]: I1007 12:04:18.383011 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:18 crc kubenswrapper[4700]: I1007 12:04:18.651121 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:19 crc kubenswrapper[4700]: I1007 12:04:19.508762 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:20 crc kubenswrapper[4700]: I1007 12:04:20.591800 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8hgg" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="registry-server" 
containerID="cri-o://346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698" gracePeriod=2 Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.074559 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.265803 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content\") pod \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.265941 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbm2p\" (UniqueName: \"kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p\") pod \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.266177 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities\") pod \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\" (UID: \"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8\") " Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.267071 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities" (OuterVolumeSpecName: "utilities") pod "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" (UID: "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.279582 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p" (OuterVolumeSpecName: "kube-api-access-dbm2p") pod "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" (UID: "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8"). InnerVolumeSpecName "kube-api-access-dbm2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.281991 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" (UID: "1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.368279 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.368340 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbm2p\" (UniqueName: \"kubernetes.io/projected/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-kube-api-access-dbm2p\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.368356 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.605961 4700 generic.go:334] "Generic (PLEG): container finished" podID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" 
containerID="346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698" exitCode=0 Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.606034 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8hgg" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.607434 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerDied","Data":"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698"} Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.607513 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8hgg" event={"ID":"1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8","Type":"ContainerDied","Data":"07c25d180c090ffda1e91e645ed27dae02719a876599b71040ad48f710909490"} Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.607547 4700 scope.go:117] "RemoveContainer" containerID="346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.654918 4700 scope.go:117] "RemoveContainer" containerID="2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.656023 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.665506 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8hgg"] Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.690875 4700 scope.go:117] "RemoveContainer" containerID="30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.749552 4700 scope.go:117] "RemoveContainer" containerID="346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698" Oct 07 
12:04:21 crc kubenswrapper[4700]: E1007 12:04:21.750109 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698\": container with ID starting with 346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698 not found: ID does not exist" containerID="346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.750185 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698"} err="failed to get container status \"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698\": rpc error: code = NotFound desc = could not find container \"346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698\": container with ID starting with 346d9ae26ca00c16104871dbf8a3b4fb59f89f5afca84960ddc895818f147698 not found: ID does not exist" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.750232 4700 scope.go:117] "RemoveContainer" containerID="2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29" Oct 07 12:04:21 crc kubenswrapper[4700]: E1007 12:04:21.750693 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29\": container with ID starting with 2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29 not found: ID does not exist" containerID="2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.750755 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29"} err="failed to get container status 
\"2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29\": rpc error: code = NotFound desc = could not find container \"2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29\": container with ID starting with 2e7143514b214ead46f94887948cbaeef8ab3aca855b1c7b04a960bdc16f4d29 not found: ID does not exist" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.750793 4700 scope.go:117] "RemoveContainer" containerID="30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db" Oct 07 12:04:21 crc kubenswrapper[4700]: E1007 12:04:21.751329 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db\": container with ID starting with 30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db not found: ID does not exist" containerID="30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.751386 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db"} err="failed to get container status \"30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db\": rpc error: code = NotFound desc = could not find container \"30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db\": container with ID starting with 30e92f1026ebe813b4c25ac1d91abe8017835ab28691a1453244a147d454c8db not found: ID does not exist" Oct 07 12:04:21 crc kubenswrapper[4700]: I1007 12:04:21.977162 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" path="/var/lib/kubelet/pods/1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8/volumes" Oct 07 12:04:43 crc kubenswrapper[4700]: I1007 12:04:43.868474 4700 generic.go:334] "Generic (PLEG): container finished" podID="f28c07c7-b33b-4203-a814-25cc5156660b" 
containerID="f41f51ba566fdb2fb72859be08695186e391d3f90ed606d98a6daeec61351edb" exitCode=0 Oct 07 12:04:43 crc kubenswrapper[4700]: I1007 12:04:43.868860 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" event={"ID":"f28c07c7-b33b-4203-a814-25cc5156660b","Type":"ContainerDied","Data":"f41f51ba566fdb2fb72859be08695186e391d3f90ed606d98a6daeec61351edb"} Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.335045 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.335630 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.335457 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.500936 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501439 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501516 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501564 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501608 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 
12:04:45.501705 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r5r8\" (UniqueName: \"kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501762 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501815 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.501889 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory\") pod \"f28c07c7-b33b-4203-a814-25cc5156660b\" (UID: \"f28c07c7-b33b-4203-a814-25cc5156660b\") " Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.509555 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8" (OuterVolumeSpecName: "kube-api-access-5r5r8") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "kube-api-access-5r5r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.509569 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.544811 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.546977 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.550500 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.557016 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.566843 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory" (OuterVolumeSpecName: "inventory") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.569439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.578801 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f28c07c7-b33b-4203-a814-25cc5156660b" (UID: "f28c07c7-b33b-4203-a814-25cc5156660b"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604593 4700 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604637 4700 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f28c07c7-b33b-4203-a814-25cc5156660b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604654 4700 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604672 4700 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604689 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r5r8\" (UniqueName: \"kubernetes.io/projected/f28c07c7-b33b-4203-a814-25cc5156660b-kube-api-access-5r5r8\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604704 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604720 4700 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-cell1-compute-config-0\") on node \"crc\" 
DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604735 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.604751 4700 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f28c07c7-b33b-4203-a814-25cc5156660b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.890389 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" event={"ID":"f28c07c7-b33b-4203-a814-25cc5156660b","Type":"ContainerDied","Data":"f4694769b03171dfd998ce75e5e1ba3c302a95ec34edd2d9681265a0f5a471e8"} Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.890430 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4694769b03171dfd998ce75e5e1ba3c302a95ec34edd2d9681265a0f5a471e8" Oct 07 12:04:45 crc kubenswrapper[4700]: I1007 12:04:45.890463 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wlkdf" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.028607 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q"] Oct 07 12:04:46 crc kubenswrapper[4700]: E1007 12:04:46.029013 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="extract-content" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029031 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="extract-content" Oct 07 12:04:46 crc kubenswrapper[4700]: E1007 12:04:46.029050 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="extract-utilities" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029058 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="extract-utilities" Oct 07 12:04:46 crc kubenswrapper[4700]: E1007 12:04:46.029094 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28c07c7-b33b-4203-a814-25cc5156660b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029105 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28c07c7-b33b-4203-a814-25cc5156660b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 12:04:46 crc kubenswrapper[4700]: E1007 12:04:46.029126 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="registry-server" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029134 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="registry-server" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029380 4700 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee2a110-6bf4-4f8a-ba30-b19c4d9908c8" containerName="registry-server" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.029401 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28c07c7-b33b-4203-a814-25cc5156660b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.030076 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.032954 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.037637 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxxzn" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.038144 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.039883 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.050210 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q"] Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.050399 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.214860 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215239 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215354 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms8vr\" (UniqueName: \"kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215412 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215575 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215642 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.215670 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318205 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318396 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318575 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms8vr\" (UniqueName: \"kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318643 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318755 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318832 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.318884 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.322807 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.322989 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.324767 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.326649 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: 
I1007 12:04:46.338390 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.340355 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.347256 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms8vr\" (UniqueName: \"kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.371437 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:04:46 crc kubenswrapper[4700]: I1007 12:04:46.933998 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q"] Oct 07 12:04:47 crc kubenswrapper[4700]: I1007 12:04:47.916349 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" event={"ID":"3b253611-bde5-4dcd-9291-284951206e6f","Type":"ContainerStarted","Data":"4e4777225e32fe64dac255d33316429e82fea6340a54ce1da22e41edd22b0c80"} Oct 07 12:04:47 crc kubenswrapper[4700]: I1007 12:04:47.916866 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" event={"ID":"3b253611-bde5-4dcd-9291-284951206e6f","Type":"ContainerStarted","Data":"63a7ebdc3081718f79f55b66e0db793a8c8c1da004acefd5435d27c7f1414f84"} Oct 07 12:04:47 crc kubenswrapper[4700]: I1007 12:04:47.942276 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" podStartSLOduration=1.255007786 podStartE2EDuration="1.94225387s" podCreationTimestamp="2025-10-07 12:04:46 +0000 UTC" firstStartedPulling="2025-10-07 12:04:46.945027972 +0000 UTC m=+2653.741426981" lastFinishedPulling="2025-10-07 12:04:47.632274046 +0000 UTC m=+2654.428673065" observedRunningTime="2025-10-07 12:04:47.940465073 +0000 UTC m=+2654.736864102" watchObservedRunningTime="2025-10-07 12:04:47.94225387 +0000 UTC m=+2654.738652869" Oct 07 12:05:15 crc kubenswrapper[4700]: I1007 12:05:15.334007 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:05:15 crc kubenswrapper[4700]: 
I1007 12:05:15.334778 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.685455 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.690008 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.712368 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.768229 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.768273 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.768600 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bmcf\" (UniqueName: \"kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf\") 
pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.871286 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.871766 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.871906 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.872365 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.872643 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bmcf\" (UniqueName: \"kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf\") pod \"certified-operators-6tl8s\" (UID: 
\"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:43 crc kubenswrapper[4700]: I1007 12:05:43.898741 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bmcf\" (UniqueName: \"kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf\") pod \"certified-operators-6tl8s\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:44 crc kubenswrapper[4700]: I1007 12:05:44.016650 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:44 crc kubenswrapper[4700]: I1007 12:05:44.517561 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:44 crc kubenswrapper[4700]: I1007 12:05:44.586408 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerStarted","Data":"61306a92e2519d73faf9e896b0c2b90c3399ccf764f847883a894bc8bf54229e"} Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.333805 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.334360 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.334427 
4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.335966 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.336085 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660" gracePeriod=600 Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.598370 4700 generic.go:334] "Generic (PLEG): container finished" podID="694c2692-da4c-46de-90ab-4005cd89fced" containerID="effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390" exitCode=0 Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.598430 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerDied","Data":"effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390"} Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.608043 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660" exitCode=0 Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.608074 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660"} Oct 07 12:05:45 crc kubenswrapper[4700]: I1007 12:05:45.608101 4700 scope.go:117] "RemoveContainer" containerID="e576be0f7f83ef624f0d7177a7f95beb1640df75972c7a17cbd531d88581d1f2" Oct 07 12:05:46 crc kubenswrapper[4700]: I1007 12:05:46.619867 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468"} Oct 07 12:05:48 crc kubenswrapper[4700]: I1007 12:05:48.640737 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerStarted","Data":"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e"} Oct 07 12:05:48 crc kubenswrapper[4700]: I1007 12:05:48.643051 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:05:49 crc kubenswrapper[4700]: I1007 12:05:49.655755 4700 generic.go:334] "Generic (PLEG): container finished" podID="694c2692-da4c-46de-90ab-4005cd89fced" containerID="a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e" exitCode=0 Oct 07 12:05:49 crc kubenswrapper[4700]: I1007 12:05:49.655843 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerDied","Data":"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e"} Oct 07 12:05:50 crc kubenswrapper[4700]: I1007 12:05:50.671021 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" 
event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerStarted","Data":"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f"} Oct 07 12:05:50 crc kubenswrapper[4700]: I1007 12:05:50.691812 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tl8s" podStartSLOduration=3.024306383 podStartE2EDuration="7.691760848s" podCreationTimestamp="2025-10-07 12:05:43 +0000 UTC" firstStartedPulling="2025-10-07 12:05:45.600561537 +0000 UTC m=+2712.396960526" lastFinishedPulling="2025-10-07 12:05:50.268015992 +0000 UTC m=+2717.064414991" observedRunningTime="2025-10-07 12:05:50.690186447 +0000 UTC m=+2717.486585466" watchObservedRunningTime="2025-10-07 12:05:50.691760848 +0000 UTC m=+2717.488159857" Oct 07 12:05:54 crc kubenswrapper[4700]: I1007 12:05:54.017898 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:54 crc kubenswrapper[4700]: I1007 12:05:54.019942 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:54 crc kubenswrapper[4700]: I1007 12:05:54.086713 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:55 crc kubenswrapper[4700]: I1007 12:05:55.791185 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:55 crc kubenswrapper[4700]: I1007 12:05:55.844136 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:57 crc kubenswrapper[4700]: I1007 12:05:57.753493 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tl8s" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="registry-server" 
containerID="cri-o://17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f" gracePeriod=2 Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.271466 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.470446 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bmcf\" (UniqueName: \"kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf\") pod \"694c2692-da4c-46de-90ab-4005cd89fced\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.470839 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content\") pod \"694c2692-da4c-46de-90ab-4005cd89fced\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.471120 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities\") pod \"694c2692-da4c-46de-90ab-4005cd89fced\" (UID: \"694c2692-da4c-46de-90ab-4005cd89fced\") " Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.472903 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities" (OuterVolumeSpecName: "utilities") pod "694c2692-da4c-46de-90ab-4005cd89fced" (UID: "694c2692-da4c-46de-90ab-4005cd89fced"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.481892 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf" (OuterVolumeSpecName: "kube-api-access-5bmcf") pod "694c2692-da4c-46de-90ab-4005cd89fced" (UID: "694c2692-da4c-46de-90ab-4005cd89fced"). InnerVolumeSpecName "kube-api-access-5bmcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.531692 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "694c2692-da4c-46de-90ab-4005cd89fced" (UID: "694c2692-da4c-46de-90ab-4005cd89fced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.574085 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.574254 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bmcf\" (UniqueName: \"kubernetes.io/projected/694c2692-da4c-46de-90ab-4005cd89fced-kube-api-access-5bmcf\") on node \"crc\" DevicePath \"\"" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.574336 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694c2692-da4c-46de-90ab-4005cd89fced-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.771580 4700 generic.go:334] "Generic (PLEG): container finished" podID="694c2692-da4c-46de-90ab-4005cd89fced" 
containerID="17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f" exitCode=0 Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.771710 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tl8s" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.771705 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerDied","Data":"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f"} Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.772594 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tl8s" event={"ID":"694c2692-da4c-46de-90ab-4005cd89fced","Type":"ContainerDied","Data":"61306a92e2519d73faf9e896b0c2b90c3399ccf764f847883a894bc8bf54229e"} Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.772635 4700 scope.go:117] "RemoveContainer" containerID="17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.811365 4700 scope.go:117] "RemoveContainer" containerID="a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.829227 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.850844 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tl8s"] Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.855028 4700 scope.go:117] "RemoveContainer" containerID="effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.899673 4700 scope.go:117] "RemoveContainer" containerID="17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f" Oct 07 
12:05:58 crc kubenswrapper[4700]: E1007 12:05:58.900245 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f\": container with ID starting with 17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f not found: ID does not exist" containerID="17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.900341 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f"} err="failed to get container status \"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f\": rpc error: code = NotFound desc = could not find container \"17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f\": container with ID starting with 17b44bb34b0208099a4cb693425a98592a5f47eaa6df73e1a41eedb8337b3d6f not found: ID does not exist" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.900383 4700 scope.go:117] "RemoveContainer" containerID="a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e" Oct 07 12:05:58 crc kubenswrapper[4700]: E1007 12:05:58.902981 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e\": container with ID starting with a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e not found: ID does not exist" containerID="a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.903280 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e"} err="failed to get container status 
\"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e\": rpc error: code = NotFound desc = could not find container \"a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e\": container with ID starting with a08981cdc34af51626336748d109030b855d8ce2d8cb74e1fadb750cb2b6936e not found: ID does not exist" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.903554 4700 scope.go:117] "RemoveContainer" containerID="effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390" Oct 07 12:05:58 crc kubenswrapper[4700]: E1007 12:05:58.904126 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390\": container with ID starting with effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390 not found: ID does not exist" containerID="effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390" Oct 07 12:05:58 crc kubenswrapper[4700]: I1007 12:05:58.904174 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390"} err="failed to get container status \"effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390\": rpc error: code = NotFound desc = could not find container \"effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390\": container with ID starting with effc71a7acee8866ab6c91fdb4c54725d09d168a3d7ca9aaafd32d0585eec390 not found: ID does not exist" Oct 07 12:05:59 crc kubenswrapper[4700]: I1007 12:05:59.976416 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694c2692-da4c-46de-90ab-4005cd89fced" path="/var/lib/kubelet/pods/694c2692-da4c-46de-90ab-4005cd89fced/volumes" Oct 07 12:07:38 crc kubenswrapper[4700]: I1007 12:07:38.871858 4700 generic.go:334] "Generic (PLEG): container finished" podID="3b253611-bde5-4dcd-9291-284951206e6f" 
containerID="4e4777225e32fe64dac255d33316429e82fea6340a54ce1da22e41edd22b0c80" exitCode=0 Oct 07 12:07:38 crc kubenswrapper[4700]: I1007 12:07:38.871976 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" event={"ID":"3b253611-bde5-4dcd-9291-284951206e6f","Type":"ContainerDied","Data":"4e4777225e32fe64dac255d33316429e82fea6340a54ce1da22e41edd22b0c80"} Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.420405 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.526734 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.526843 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.526881 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.526944 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms8vr\" (UniqueName: 
\"kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.526998 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.527040 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.527070 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory\") pod \"3b253611-bde5-4dcd-9291-284951206e6f\" (UID: \"3b253611-bde5-4dcd-9291-284951206e6f\") " Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.532537 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr" (OuterVolumeSpecName: "kube-api-access-ms8vr") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "kube-api-access-ms8vr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.532555 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.560158 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory" (OuterVolumeSpecName: "inventory") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.561121 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.562262 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.564445 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.565029 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "3b253611-bde5-4dcd-9291-284951206e6f" (UID: "3b253611-bde5-4dcd-9291-284951206e6f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.629720 4700 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630127 4700 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630186 4700 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630210 4700 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630231 4700 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630253 4700 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b253611-bde5-4dcd-9291-284951206e6f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.630271 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms8vr\" (UniqueName: \"kubernetes.io/projected/3b253611-bde5-4dcd-9291-284951206e6f-kube-api-access-ms8vr\") on node \"crc\" DevicePath \"\"" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.898962 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" event={"ID":"3b253611-bde5-4dcd-9291-284951206e6f","Type":"ContainerDied","Data":"63a7ebdc3081718f79f55b66e0db793a8c8c1da004acefd5435d27c7f1414f84"} Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.899002 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q" Oct 07 12:07:40 crc kubenswrapper[4700]: I1007 12:07:40.899004 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a7ebdc3081718f79f55b66e0db793a8c8c1da004acefd5435d27c7f1414f84" Oct 07 12:07:45 crc kubenswrapper[4700]: I1007 12:07:45.334360 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:07:45 crc kubenswrapper[4700]: I1007 12:07:45.335076 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.310207 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:15 crc kubenswrapper[4700]: E1007 12:08:15.312257 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="extract-content" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.312448 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="extract-content" Oct 07 12:08:15 crc kubenswrapper[4700]: E1007 12:08:15.312533 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b253611-bde5-4dcd-9291-284951206e6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.312595 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3b253611-bde5-4dcd-9291-284951206e6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 12:08:15 crc kubenswrapper[4700]: E1007 12:08:15.312660 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="extract-utilities" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.312721 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="extract-utilities" Oct 07 12:08:15 crc kubenswrapper[4700]: E1007 12:08:15.312790 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="registry-server" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.312850 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="registry-server" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.313103 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="694c2692-da4c-46de-90ab-4005cd89fced" containerName="registry-server" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.313175 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b253611-bde5-4dcd-9291-284951206e6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.314526 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.333537 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.333683 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.349056 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.389038 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbc7v\" (UniqueName: \"kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.389124 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.389655 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.492600 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.493052 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbc7v\" (UniqueName: \"kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.493085 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.493498 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.493566 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.529289 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbc7v\" (UniqueName: \"kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v\") pod \"redhat-operators-7cbqb\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:15 crc kubenswrapper[4700]: I1007 12:08:15.640448 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:16 crc kubenswrapper[4700]: I1007 12:08:16.180755 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:16 crc kubenswrapper[4700]: I1007 12:08:16.310402 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerStarted","Data":"baa59d6980a18ef2d7bcf5ef2182c011606fbf7381b40fcb5af428e6f323d0b7"} Oct 07 12:08:17 crc kubenswrapper[4700]: I1007 12:08:17.325889 4700 generic.go:334] "Generic (PLEG): container finished" podID="47d31880-758b-40b8-af22-a10e10c062f5" containerID="8e26b16f7a5607efd4b0fd262d95abd1e18bd84a8ddae5c9d60d789acb6c604d" exitCode=0 Oct 07 12:08:17 crc kubenswrapper[4700]: I1007 12:08:17.325996 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerDied","Data":"8e26b16f7a5607efd4b0fd262d95abd1e18bd84a8ddae5c9d60d789acb6c604d"} Oct 07 12:08:19 crc kubenswrapper[4700]: I1007 12:08:19.348660 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerStarted","Data":"e91a3b957339d6877c3f15cb001648019413c3ecb1378a4c57201096db49f984"} Oct 07 12:08:20 crc kubenswrapper[4700]: I1007 12:08:20.362265 4700 generic.go:334] "Generic (PLEG): container finished" podID="47d31880-758b-40b8-af22-a10e10c062f5" containerID="e91a3b957339d6877c3f15cb001648019413c3ecb1378a4c57201096db49f984" exitCode=0 Oct 07 12:08:20 crc kubenswrapper[4700]: I1007 12:08:20.362782 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerDied","Data":"e91a3b957339d6877c3f15cb001648019413c3ecb1378a4c57201096db49f984"} Oct 07 12:08:20 crc kubenswrapper[4700]: I1007 12:08:20.362812 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerStarted","Data":"106f8f681a92f0289f32586390c0e4ea292eb272cb0a722039ae45aa9393ca87"} Oct 07 12:08:20 crc kubenswrapper[4700]: I1007 12:08:20.394187 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7cbqb" podStartSLOduration=2.7581593140000003 podStartE2EDuration="5.39416568s" podCreationTimestamp="2025-10-07 12:08:15 +0000 UTC" firstStartedPulling="2025-10-07 12:08:17.328446474 +0000 UTC m=+2864.124845493" lastFinishedPulling="2025-10-07 12:08:19.96445287 +0000 UTC m=+2866.760851859" observedRunningTime="2025-10-07 12:08:20.384928528 +0000 UTC m=+2867.181327517" watchObservedRunningTime="2025-10-07 12:08:20.39416568 +0000 UTC m=+2867.190564669" Oct 07 12:08:25 crc kubenswrapper[4700]: I1007 12:08:25.640889 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:25 crc kubenswrapper[4700]: I1007 12:08:25.641521 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:25 crc kubenswrapper[4700]: I1007 12:08:25.745362 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:26 crc kubenswrapper[4700]: I1007 12:08:26.510959 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:26 crc kubenswrapper[4700]: I1007 12:08:26.583195 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:28 crc kubenswrapper[4700]: I1007 12:08:28.459216 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7cbqb" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="registry-server" containerID="cri-o://106f8f681a92f0289f32586390c0e4ea292eb272cb0a722039ae45aa9393ca87" gracePeriod=2 Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.478423 4700 generic.go:334] "Generic (PLEG): container finished" podID="47d31880-758b-40b8-af22-a10e10c062f5" containerID="106f8f681a92f0289f32586390c0e4ea292eb272cb0a722039ae45aa9393ca87" exitCode=0 Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.478525 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerDied","Data":"106f8f681a92f0289f32586390c0e4ea292eb272cb0a722039ae45aa9393ca87"} Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.479592 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cbqb" event={"ID":"47d31880-758b-40b8-af22-a10e10c062f5","Type":"ContainerDied","Data":"baa59d6980a18ef2d7bcf5ef2182c011606fbf7381b40fcb5af428e6f323d0b7"} Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.479693 4700 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa59d6980a18ef2d7bcf5ef2182c011606fbf7381b40fcb5af428e6f323d0b7" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.523455 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.612194 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities\") pod \"47d31880-758b-40b8-af22-a10e10c062f5\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.612568 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content\") pod \"47d31880-758b-40b8-af22-a10e10c062f5\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.612608 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbc7v\" (UniqueName: \"kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v\") pod \"47d31880-758b-40b8-af22-a10e10c062f5\" (UID: \"47d31880-758b-40b8-af22-a10e10c062f5\") " Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.613191 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities" (OuterVolumeSpecName: "utilities") pod "47d31880-758b-40b8-af22-a10e10c062f5" (UID: "47d31880-758b-40b8-af22-a10e10c062f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.621503 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v" (OuterVolumeSpecName: "kube-api-access-fbc7v") pod "47d31880-758b-40b8-af22-a10e10c062f5" (UID: "47d31880-758b-40b8-af22-a10e10c062f5"). InnerVolumeSpecName "kube-api-access-fbc7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.697620 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47d31880-758b-40b8-af22-a10e10c062f5" (UID: "47d31880-758b-40b8-af22-a10e10c062f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.714328 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.714368 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbc7v\" (UniqueName: \"kubernetes.io/projected/47d31880-758b-40b8-af22-a10e10c062f5-kube-api-access-fbc7v\") on node \"crc\" DevicePath \"\"" Oct 07 12:08:29 crc kubenswrapper[4700]: I1007 12:08:29.714382 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d31880-758b-40b8-af22-a10e10c062f5-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:08:30 crc kubenswrapper[4700]: I1007 12:08:30.489644 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7cbqb" Oct 07 12:08:30 crc kubenswrapper[4700]: I1007 12:08:30.519416 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:30 crc kubenswrapper[4700]: I1007 12:08:30.527407 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7cbqb"] Oct 07 12:08:31 crc kubenswrapper[4700]: I1007 12:08:31.986869 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d31880-758b-40b8-af22-a10e10c062f5" path="/var/lib/kubelet/pods/47d31880-758b-40b8-af22-a10e10c062f5/volumes" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.334285 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.335222 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.335343 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.336483 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.336612 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" gracePeriod=600 Oct 07 12:08:45 crc kubenswrapper[4700]: E1007 12:08:45.477204 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.687910 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" exitCode=0 Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.687982 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468"} Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.688032 4700 scope.go:117] "RemoveContainer" containerID="077fbf1964642735b6e4a2875e82d8725a07632de4cee876d72621725a72a660" Oct 07 12:08:45 crc kubenswrapper[4700]: I1007 12:08:45.688864 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:08:45 crc kubenswrapper[4700]: E1007 12:08:45.689212 4700 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:08:59 crc kubenswrapper[4700]: I1007 12:08:59.958734 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:08:59 crc kubenswrapper[4700]: E1007 12:08:59.959914 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:09:13 crc kubenswrapper[4700]: I1007 12:09:13.965334 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:09:13 crc kubenswrapper[4700]: E1007 12:09:13.966370 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:09:26 crc kubenswrapper[4700]: I1007 12:09:26.958561 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:09:26 crc kubenswrapper[4700]: E1007 12:09:26.959589 4700 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:09:41 crc kubenswrapper[4700]: I1007 12:09:41.957741 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:09:41 crc kubenswrapper[4700]: E1007 12:09:41.958645 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:09:52 crc kubenswrapper[4700]: I1007 12:09:52.957583 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:09:52 crc kubenswrapper[4700]: E1007 12:09:52.958521 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:07 crc kubenswrapper[4700]: I1007 12:10:07.957655 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:10:07 crc kubenswrapper[4700]: E1007 
12:10:07.958349 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:18 crc kubenswrapper[4700]: I1007 12:10:18.960163 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:10:18 crc kubenswrapper[4700]: E1007 12:10:18.960938 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:30 crc kubenswrapper[4700]: I1007 12:10:30.957889 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:10:30 crc kubenswrapper[4700]: E1007 12:10:30.958930 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:31 crc kubenswrapper[4700]: I1007 12:10:31.214392 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.065834 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.066307 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" containerName="openstackclient" containerID="cri-o://8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace" gracePeriod=2 Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.075844 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.102383 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 12:10:33 crc kubenswrapper[4700]: E1007 12:10:33.102930 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="extract-content" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.102946 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="extract-content" Oct 07 12:10:33 crc kubenswrapper[4700]: E1007 12:10:33.102963 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="registry-server" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.102970 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="registry-server" Oct 07 12:10:33 crc kubenswrapper[4700]: E1007 12:10:33.103017 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="extract-utilities" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 
12:10:33.103025 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="extract-utilities" Oct 07 12:10:33 crc kubenswrapper[4700]: E1007 12:10:33.103038 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" containerName="openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.103045 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" containerName="openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.103257 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d31880-758b-40b8-af22-a10e10c062f5" containerName="registry-server" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.103292 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" containerName="openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.104073 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.123623 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.128905 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.183421 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvrn\" (UniqueName: \"kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.183525 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.183571 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.183851 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.285706 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.285830 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wvrn\" (UniqueName: \"kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.285863 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.285889 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.286868 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.292523 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.296759 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.305091 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvrn\" (UniqueName: \"kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn\") pod \"openstackclient\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.423777 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.795278 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:10:33 crc kubenswrapper[4700]: I1007 12:10:33.924084 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d6005c71-9a39-4ab6-876a-99dd2f60c9ae","Type":"ContainerStarted","Data":"f9ce0545fe93a815a8c74570d11f849737c9e54797e71773e58d794bee2bca90"} Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.082019 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-2bphs"] Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.083956 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.096551 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-2bphs"] Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.205264 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmjn\" (UniqueName: \"kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn\") pod \"aodh-db-create-2bphs\" (UID: \"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6\") " pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.307152 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmjn\" (UniqueName: \"kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn\") pod \"aodh-db-create-2bphs\" (UID: \"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6\") " pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.329178 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmjn\" (UniqueName: \"kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn\") pod \"aodh-db-create-2bphs\" (UID: \"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6\") " pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.414774 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.880438 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-2bphs"] Oct 07 12:10:34 crc kubenswrapper[4700]: W1007 12:10:34.884608 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c2f4928_4400_49ed_bc37_8d9ab2aa0de6.slice/crio-de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3 WatchSource:0}: Error finding container de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3: Status 404 returned error can't find the container with id de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3 Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.937790 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d6005c71-9a39-4ab6-876a-99dd2f60c9ae","Type":"ContainerStarted","Data":"0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c"} Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.940895 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2bphs" event={"ID":"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6","Type":"ContainerStarted","Data":"de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3"} Oct 07 12:10:34 crc kubenswrapper[4700]: I1007 12:10:34.959354 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.959332083 podStartE2EDuration="1.959332083s" podCreationTimestamp="2025-10-07 12:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:10:34.952793522 +0000 UTC m=+3001.749192521" watchObservedRunningTime="2025-10-07 12:10:34.959332083 +0000 UTC m=+3001.755731082" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.478855 4700 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.482841 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.532329 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wd6k\" (UniqueName: \"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k\") pod \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.532391 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret\") pod \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.532473 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config\") pod \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.532507 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle\") pod \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\" (UID: \"1cb5863c-c473-49b8-9ffa-ce83d51a061c\") " Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.539283 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k" (OuterVolumeSpecName: "kube-api-access-8wd6k") pod "1cb5863c-c473-49b8-9ffa-ce83d51a061c" (UID: "1cb5863c-c473-49b8-9ffa-ce83d51a061c"). InnerVolumeSpecName "kube-api-access-8wd6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.563678 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cb5863c-c473-49b8-9ffa-ce83d51a061c" (UID: "1cb5863c-c473-49b8-9ffa-ce83d51a061c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.570414 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1cb5863c-c473-49b8-9ffa-ce83d51a061c" (UID: "1cb5863c-c473-49b8-9ffa-ce83d51a061c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.599493 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1cb5863c-c473-49b8-9ffa-ce83d51a061c" (UID: "1cb5863c-c473-49b8-9ffa-ce83d51a061c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.634959 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.634997 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.635009 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wd6k\" (UniqueName: \"kubernetes.io/projected/1cb5863c-c473-49b8-9ffa-ce83d51a061c-kube-api-access-8wd6k\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.635023 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cb5863c-c473-49b8-9ffa-ce83d51a061c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.952131 4700 generic.go:334] "Generic (PLEG): container finished" podID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" containerID="8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace" exitCode=137 Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.952198 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.952250 4700 scope.go:117] "RemoveContainer" containerID="8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.964923 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.967466 4700 generic.go:334] "Generic (PLEG): container finished" podID="1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" containerID="ef381d6512f44f24c157cd4ab61301f4206444d8c19eb34d59f473c90aeb4af1" exitCode=0 Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.978118 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb5863c-c473-49b8-9ffa-ce83d51a061c" path="/var/lib/kubelet/pods/1cb5863c-c473-49b8-9ffa-ce83d51a061c/volumes" Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.982616 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2bphs" event={"ID":"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6","Type":"ContainerDied","Data":"ef381d6512f44f24c157cd4ab61301f4206444d8c19eb34d59f473c90aeb4af1"} Oct 07 12:10:35 crc kubenswrapper[4700]: I1007 12:10:35.997353 4700 scope.go:117] "RemoveContainer" containerID="8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace" Oct 07 12:10:35 crc kubenswrapper[4700]: E1007 12:10:35.997822 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace\": container with ID starting with 8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace not found: ID does not exist" containerID="8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace" Oct 07 12:10:35 crc 
kubenswrapper[4700]: I1007 12:10:35.997853 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace"} err="failed to get container status \"8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace\": rpc error: code = NotFound desc = could not find container \"8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace\": container with ID starting with 8274f66316f6312d8cf6b2599d570d71bb607f9434378e8043d2f90cfd921ace not found: ID does not exist" Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.369932 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.468882 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngmjn\" (UniqueName: \"kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn\") pod \"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6\" (UID: \"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6\") " Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.477574 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn" (OuterVolumeSpecName: "kube-api-access-ngmjn") pod "1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" (UID: "1c2f4928-4400-49ed-bc37-8d9ab2aa0de6"). InnerVolumeSpecName "kube-api-access-ngmjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.572097 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngmjn\" (UniqueName: \"kubernetes.io/projected/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6-kube-api-access-ngmjn\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.993573 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2bphs" event={"ID":"1c2f4928-4400-49ed-bc37-8d9ab2aa0de6","Type":"ContainerDied","Data":"de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3"} Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.993608 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de38ebb888e0e298e17851fbe67b592d131fd4253b79dc15d1aa6b2f855b82b3" Oct 07 12:10:37 crc kubenswrapper[4700]: I1007 12:10:37.993692 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-2bphs" Oct 07 12:10:41 crc kubenswrapper[4700]: I1007 12:10:41.957939 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:10:41 crc kubenswrapper[4700]: E1007 12:10:41.959097 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.182078 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-998f-account-create-tlv92"] Oct 07 12:10:44 crc kubenswrapper[4700]: E1007 12:10:44.182862 4700 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" containerName="mariadb-database-create" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.182880 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" containerName="mariadb-database-create" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.183160 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" containerName="mariadb-database-create" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.183988 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.186702 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.192092 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-998f-account-create-tlv92"] Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.206238 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltczw\" (UniqueName: \"kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw\") pod \"aodh-998f-account-create-tlv92\" (UID: \"c4fb38c5-276c-41c2-8f9c-f03e43564dbe\") " pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.308211 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltczw\" (UniqueName: \"kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw\") pod \"aodh-998f-account-create-tlv92\" (UID: \"c4fb38c5-276c-41c2-8f9c-f03e43564dbe\") " pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.328959 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ltczw\" (UniqueName: \"kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw\") pod \"aodh-998f-account-create-tlv92\" (UID: \"c4fb38c5-276c-41c2-8f9c-f03e43564dbe\") " pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:44 crc kubenswrapper[4700]: I1007 12:10:44.518389 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:45 crc kubenswrapper[4700]: I1007 12:10:45.032611 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-998f-account-create-tlv92"] Oct 07 12:10:45 crc kubenswrapper[4700]: W1007 12:10:45.039732 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4fb38c5_276c_41c2_8f9c_f03e43564dbe.slice/crio-6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948 WatchSource:0}: Error finding container 6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948: Status 404 returned error can't find the container with id 6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948 Oct 07 12:10:45 crc kubenswrapper[4700]: I1007 12:10:45.067831 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-998f-account-create-tlv92" event={"ID":"c4fb38c5-276c-41c2-8f9c-f03e43564dbe","Type":"ContainerStarted","Data":"6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948"} Oct 07 12:10:46 crc kubenswrapper[4700]: I1007 12:10:46.081227 4700 generic.go:334] "Generic (PLEG): container finished" podID="c4fb38c5-276c-41c2-8f9c-f03e43564dbe" containerID="52b1fe863acc3f636ce580f3d15430d810f27b16ae938f2f15ec148b642ffd0e" exitCode=0 Oct 07 12:10:46 crc kubenswrapper[4700]: I1007 12:10:46.081277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-998f-account-create-tlv92" 
event={"ID":"c4fb38c5-276c-41c2-8f9c-f03e43564dbe","Type":"ContainerDied","Data":"52b1fe863acc3f636ce580f3d15430d810f27b16ae938f2f15ec148b642ffd0e"} Oct 07 12:10:47 crc kubenswrapper[4700]: I1007 12:10:47.474476 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:47 crc kubenswrapper[4700]: I1007 12:10:47.672965 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltczw\" (UniqueName: \"kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw\") pod \"c4fb38c5-276c-41c2-8f9c-f03e43564dbe\" (UID: \"c4fb38c5-276c-41c2-8f9c-f03e43564dbe\") " Oct 07 12:10:47 crc kubenswrapper[4700]: I1007 12:10:47.683766 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw" (OuterVolumeSpecName: "kube-api-access-ltczw") pod "c4fb38c5-276c-41c2-8f9c-f03e43564dbe" (UID: "c4fb38c5-276c-41c2-8f9c-f03e43564dbe"). InnerVolumeSpecName "kube-api-access-ltczw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:10:47 crc kubenswrapper[4700]: I1007 12:10:47.776208 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltczw\" (UniqueName: \"kubernetes.io/projected/c4fb38c5-276c-41c2-8f9c-f03e43564dbe-kube-api-access-ltczw\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:48 crc kubenswrapper[4700]: I1007 12:10:48.127453 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-998f-account-create-tlv92" event={"ID":"c4fb38c5-276c-41c2-8f9c-f03e43564dbe","Type":"ContainerDied","Data":"6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948"} Oct 07 12:10:48 crc kubenswrapper[4700]: I1007 12:10:48.127512 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-998f-account-create-tlv92" Oct 07 12:10:48 crc kubenswrapper[4700]: I1007 12:10:48.127517 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d361590e6cdb130a565e354f347a108b01be85de9312301870a4821c785b948" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.496946 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-2nc8j"] Oct 07 12:10:49 crc kubenswrapper[4700]: E1007 12:10:49.497797 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4fb38c5-276c-41c2-8f9c-f03e43564dbe" containerName="mariadb-account-create" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.497818 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4fb38c5-276c-41c2-8f9c-f03e43564dbe" containerName="mariadb-account-create" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.498092 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4fb38c5-276c-41c2-8f9c-f03e43564dbe" containerName="mariadb-account-create" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.498896 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.501574 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.501685 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.506501 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.508526 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.508669 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.508750 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvm7b\" (UniqueName: \"kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.508810 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data\") pod 
\"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.512278 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-2nc8j"] Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.610887 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.610993 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.611062 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvm7b\" (UniqueName: \"kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.611107 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.615805 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts\") pod \"aodh-db-sync-2nc8j\" (UID: 
\"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.632616 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.633781 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.635945 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvm7b\" (UniqueName: \"kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b\") pod \"aodh-db-sync-2nc8j\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:49 crc kubenswrapper[4700]: I1007 12:10:49.823654 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:50 crc kubenswrapper[4700]: I1007 12:10:50.316897 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-2nc8j"] Oct 07 12:10:50 crc kubenswrapper[4700]: W1007 12:10:50.319528 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11ba889b_74ce_4f70_8ee0_d5a21cc39d98.slice/crio-4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520 WatchSource:0}: Error finding container 4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520: Status 404 returned error can't find the container with id 4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520 Oct 07 12:10:50 crc kubenswrapper[4700]: I1007 12:10:50.321962 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:10:51 crc kubenswrapper[4700]: I1007 12:10:51.155406 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2nc8j" event={"ID":"11ba889b-74ce-4f70-8ee0-d5a21cc39d98","Type":"ContainerStarted","Data":"4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520"} Oct 07 12:10:56 crc kubenswrapper[4700]: I1007 12:10:56.217781 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2nc8j" event={"ID":"11ba889b-74ce-4f70-8ee0-d5a21cc39d98","Type":"ContainerStarted","Data":"660657e66ff61d19b133777d92fb411d0020e3d8f392295549535bd92fee14af"} Oct 07 12:10:56 crc kubenswrapper[4700]: I1007 12:10:56.236952 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-2nc8j" podStartSLOduration=2.458851031 podStartE2EDuration="7.236931208s" podCreationTimestamp="2025-10-07 12:10:49 +0000 UTC" firstStartedPulling="2025-10-07 12:10:50.321751786 +0000 UTC m=+3017.118150775" lastFinishedPulling="2025-10-07 12:10:55.099831913 +0000 UTC m=+3021.896230952" 
observedRunningTime="2025-10-07 12:10:56.233723904 +0000 UTC m=+3023.030122913" watchObservedRunningTime="2025-10-07 12:10:56.236931208 +0000 UTC m=+3023.033330197" Oct 07 12:10:56 crc kubenswrapper[4700]: I1007 12:10:56.958062 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:10:56 crc kubenswrapper[4700]: E1007 12:10:56.958962 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:10:58 crc kubenswrapper[4700]: I1007 12:10:58.238865 4700 generic.go:334] "Generic (PLEG): container finished" podID="11ba889b-74ce-4f70-8ee0-d5a21cc39d98" containerID="660657e66ff61d19b133777d92fb411d0020e3d8f392295549535bd92fee14af" exitCode=0 Oct 07 12:10:58 crc kubenswrapper[4700]: I1007 12:10:58.238927 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2nc8j" event={"ID":"11ba889b-74ce-4f70-8ee0-d5a21cc39d98","Type":"ContainerDied","Data":"660657e66ff61d19b133777d92fb411d0020e3d8f392295549535bd92fee14af"} Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.594375 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.658489 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts\") pod \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.658562 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvm7b\" (UniqueName: \"kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b\") pod \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.658948 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data\") pod \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.659209 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle\") pod \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\" (UID: \"11ba889b-74ce-4f70-8ee0-d5a21cc39d98\") " Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.664439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts" (OuterVolumeSpecName: "scripts") pod "11ba889b-74ce-4f70-8ee0-d5a21cc39d98" (UID: "11ba889b-74ce-4f70-8ee0-d5a21cc39d98"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.666511 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b" (OuterVolumeSpecName: "kube-api-access-zvm7b") pod "11ba889b-74ce-4f70-8ee0-d5a21cc39d98" (UID: "11ba889b-74ce-4f70-8ee0-d5a21cc39d98"). InnerVolumeSpecName "kube-api-access-zvm7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.701139 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11ba889b-74ce-4f70-8ee0-d5a21cc39d98" (UID: "11ba889b-74ce-4f70-8ee0-d5a21cc39d98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.703836 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data" (OuterVolumeSpecName: "config-data") pod "11ba889b-74ce-4f70-8ee0-d5a21cc39d98" (UID: "11ba889b-74ce-4f70-8ee0-d5a21cc39d98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.761924 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.761993 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.762022 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:10:59 crc kubenswrapper[4700]: I1007 12:10:59.762050 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvm7b\" (UniqueName: \"kubernetes.io/projected/11ba889b-74ce-4f70-8ee0-d5a21cc39d98-kube-api-access-zvm7b\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:00 crc kubenswrapper[4700]: I1007 12:11:00.279863 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2nc8j" event={"ID":"11ba889b-74ce-4f70-8ee0-d5a21cc39d98","Type":"ContainerDied","Data":"4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520"} Oct 07 12:11:00 crc kubenswrapper[4700]: I1007 12:11:00.279901 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7b5a388e5eb9edfcb029bba9ff0a743f47547afc3670c36e4235cbcc91c520" Oct 07 12:11:00 crc kubenswrapper[4700]: I1007 12:11:00.279930 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2nc8j" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.161851 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:04 crc kubenswrapper[4700]: E1007 12:11:04.167278 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ba889b-74ce-4f70-8ee0-d5a21cc39d98" containerName="aodh-db-sync" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.167322 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ba889b-74ce-4f70-8ee0-d5a21cc39d98" containerName="aodh-db-sync" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.167569 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ba889b-74ce-4f70-8ee0-d5a21cc39d98" containerName="aodh-db-sync" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.171077 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.178637 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.178923 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.181865 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.193743 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.356295 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gfj\" (UniqueName: \"kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc 
kubenswrapper[4700]: I1007 12:11:04.356666 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.356690 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.356750 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.458402 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.458469 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.458551 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts\") pod \"aodh-0\" (UID: 
\"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.458691 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gfj\" (UniqueName: \"kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.465629 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.469948 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.470983 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.487905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gfj\" (UniqueName: \"kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj\") pod \"aodh-0\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " pod="openstack/aodh-0" Oct 07 12:11:04 crc kubenswrapper[4700]: I1007 12:11:04.529185 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:05 crc kubenswrapper[4700]: I1007 12:11:05.015411 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:05 crc kubenswrapper[4700]: I1007 12:11:05.330812 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerStarted","Data":"3a1d0bd7743150d84e54bab9918e9839581eb17e2da1d11e74aaff01502bd081"} Oct 07 12:11:06 crc kubenswrapper[4700]: I1007 12:11:06.208461 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:06 crc kubenswrapper[4700]: I1007 12:11:06.338894 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="sg-core" containerID="cri-o://e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e" gracePeriod=30 Oct 07 12:11:06 crc kubenswrapper[4700]: I1007 12:11:06.339066 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="proxy-httpd" containerID="cri-o://477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74" gracePeriod=30 Oct 07 12:11:06 crc kubenswrapper[4700]: I1007 12:11:06.339133 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-notification-agent" containerID="cri-o://89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c" gracePeriod=30 Oct 07 12:11:06 crc kubenswrapper[4700]: I1007 12:11:06.339214 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-central-agent" 
containerID="cri-o://8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96" gracePeriod=30 Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353198 4700 generic.go:334] "Generic (PLEG): container finished" podID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerID="477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74" exitCode=0 Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353555 4700 generic.go:334] "Generic (PLEG): container finished" podID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerID="e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e" exitCode=2 Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353569 4700 generic.go:334] "Generic (PLEG): container finished" podID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerID="8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96" exitCode=0 Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353250 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerDied","Data":"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74"} Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353615 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerDied","Data":"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e"} Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.353634 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerDied","Data":"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96"} Oct 07 12:11:07 crc kubenswrapper[4700]: I1007 12:11:07.357056 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:08 crc kubenswrapper[4700]: I1007 12:11:08.362536 4700 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerStarted","Data":"c33af5f235c3b5087e43ff77b61337789c7f327ba7921af5478571174a69287a"} Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.327260 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.390521 4700 generic.go:334] "Generic (PLEG): container finished" podID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerID="89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c" exitCode=0 Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.390579 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerDied","Data":"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c"} Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.390613 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5193cebf-c7b2-4e53-8dc1-d3c37a551e03","Type":"ContainerDied","Data":"0596ae1af214f195bcd9b8510c5bacd2866f4897657cd316fd30e4da9350e02d"} Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.390634 4700 scope.go:117] "RemoveContainer" containerID="477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.390873 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.434488 4700 scope.go:117] "RemoveContainer" containerID="e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.462619 4700 scope.go:117] "RemoveContainer" containerID="89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480173 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480671 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480811 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480862 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480894 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480923 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480958 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgr42\" (UniqueName: \"kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.480988 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data\") pod \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\" (UID: \"5193cebf-c7b2-4e53-8dc1-d3c37a551e03\") " Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.481630 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.482027 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.484080 4700 scope.go:117] "RemoveContainer" containerID="8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.485979 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts" (OuterVolumeSpecName: "scripts") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.488063 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42" (OuterVolumeSpecName: "kube-api-access-jgr42") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "kube-api-access-jgr42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.526822 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.538609 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.571198 4700 scope.go:117] "RemoveContainer" containerID="477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.571993 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74\": container with ID starting with 477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74 not found: ID does not exist" containerID="477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.573677 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74"} err="failed to get container status \"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74\": rpc error: code = NotFound desc = could not find container \"477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74\": container with ID starting with 477e79ecd61b79976efd124fa13866daea394e5cb35069114ff2658be2286e74 not found: ID does not exist" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.573738 4700 scope.go:117] "RemoveContainer" containerID="e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.574431 4700 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e\": container with ID starting with e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e not found: ID does not exist" containerID="e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.574463 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e"} err="failed to get container status \"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e\": rpc error: code = NotFound desc = could not find container \"e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e\": container with ID starting with e3451518ed041978620c92b36027bf5c035b675a1b7ed51a98194f3ad9259e7e not found: ID does not exist" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.574534 4700 scope.go:117] "RemoveContainer" containerID="89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.574891 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c\": container with ID starting with 89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c not found: ID does not exist" containerID="89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.574953 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c"} err="failed to get container status \"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c\": rpc error: code = NotFound 
desc = could not find container \"89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c\": container with ID starting with 89dce6e6bfc597c1d929128947cd3721cd82d7e29c57c2c47994650b43ece49c not found: ID does not exist" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.575050 4700 scope.go:117] "RemoveContainer" containerID="8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.575529 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96\": container with ID starting with 8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96 not found: ID does not exist" containerID="8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.575593 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96"} err="failed to get container status \"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96\": rpc error: code = NotFound desc = could not find container \"8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96\": container with ID starting with 8e42aaa5b9c5d1137fac1f44a2c67f854ffb18539aa419f0f307be279d6f4b96 not found: ID does not exist" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583497 4700 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583724 4700 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-ceilometer-tls-certs\") on node \"crc\" 
DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583784 4700 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583840 4700 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583910 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgr42\" (UniqueName: \"kubernetes.io/projected/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-kube-api-access-jgr42\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.583989 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.597691 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.663175 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data" (OuterVolumeSpecName: "config-data") pod "5193cebf-c7b2-4e53-8dc1-d3c37a551e03" (UID: "5193cebf-c7b2-4e53-8dc1-d3c37a551e03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.685686 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.685725 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193cebf-c7b2-4e53-8dc1-d3c37a551e03-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.734475 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.744584 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.761912 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.762291 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="sg-core" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762327 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="sg-core" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.762354 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-central-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762361 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-central-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.762380 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" 
containerName="ceilometer-notification-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762388 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-notification-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: E1007 12:11:10.762403 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="proxy-httpd" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762411 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="proxy-httpd" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762585 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-notification-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762605 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="ceilometer-central-agent" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762621 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="proxy-httpd" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.762636 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" containerName="sg-core" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.764406 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.767836 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.768029 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.768119 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.782259 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.888938 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.889227 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.889358 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-log-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.889509 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-scripts\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.889609 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.889887 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-config-data\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.890074 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-run-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.890238 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfdc\" (UniqueName: \"kubernetes.io/projected/dff20986-65c2-4eb2-859c-55ea212165b5-kube-api-access-bsfdc\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:10 crc kubenswrapper[4700]: I1007 12:11:10.957472 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:11:11 crc kubenswrapper[4700]: E1007 12:11:10.957961 4700 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.991858 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfdc\" (UniqueName: \"kubernetes.io/projected/dff20986-65c2-4eb2-859c-55ea212165b5-kube-api-access-bsfdc\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.991980 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992008 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992059 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-log-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992104 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-scripts\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992128 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992192 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-config-data\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992244 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-run-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992672 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-log-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.992721 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dff20986-65c2-4eb2-859c-55ea212165b5-run-httpd\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.996118 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-scripts\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.996199 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-config-data\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.996562 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:10.997690 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:11.004998 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dff20986-65c2-4eb2-859c-55ea212165b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") " pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:11.014970 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfdc\" (UniqueName: \"kubernetes.io/projected/dff20986-65c2-4eb2-859c-55ea212165b5-kube-api-access-bsfdc\") pod \"ceilometer-0\" (UID: \"dff20986-65c2-4eb2-859c-55ea212165b5\") 
" pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:11.080812 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:11.407760 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerStarted","Data":"9ddbd5b9f6d034b1ee0a43ea9b4a134fb4c92e36ebb9c0ee520477a14ce99e7f"} Oct 07 12:11:11 crc kubenswrapper[4700]: I1007 12:11:11.973088 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5193cebf-c7b2-4e53-8dc1-d3c37a551e03" path="/var/lib/kubelet/pods/5193cebf-c7b2-4e53-8dc1-d3c37a551e03/volumes" Oct 07 12:11:12 crc kubenswrapper[4700]: I1007 12:11:12.169675 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:11:12 crc kubenswrapper[4700]: W1007 12:11:12.191455 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddff20986_65c2_4eb2_859c_55ea212165b5.slice/crio-ce5ce59d04235a7f192b6654b024024f36e182902e702f5226d6bdedb8df2e31 WatchSource:0}: Error finding container ce5ce59d04235a7f192b6654b024024f36e182902e702f5226d6bdedb8df2e31: Status 404 returned error can't find the container with id ce5ce59d04235a7f192b6654b024024f36e182902e702f5226d6bdedb8df2e31 Oct 07 12:11:12 crc kubenswrapper[4700]: I1007 12:11:12.444847 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dff20986-65c2-4eb2-859c-55ea212165b5","Type":"ContainerStarted","Data":"ce5ce59d04235a7f192b6654b024024f36e182902e702f5226d6bdedb8df2e31"} Oct 07 12:11:13 crc kubenswrapper[4700]: I1007 12:11:13.454877 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerStarted","Data":"dbba1b6233af0065e8654d82f73d3928068f25b322ff4f21848a1f21f25094f2"} Oct 07 12:11:13 crc kubenswrapper[4700]: I1007 12:11:13.456283 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dff20986-65c2-4eb2-859c-55ea212165b5","Type":"ContainerStarted","Data":"34747b80dffa1ba3fa024d96d69de90c2e63b4d1c60cb1b38833eb133a979c22"} Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.479644 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerStarted","Data":"d5e546f1c8a32ce3d78492cb079e243af12c60a32055e2b1abf05a841c8d9f85"} Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.479729 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-api" containerID="cri-o://c33af5f235c3b5087e43ff77b61337789c7f327ba7921af5478571174a69287a" gracePeriod=30 Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.479750 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-listener" containerID="cri-o://d5e546f1c8a32ce3d78492cb079e243af12c60a32055e2b1abf05a841c8d9f85" gracePeriod=30 Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.479821 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-evaluator" containerID="cri-o://9ddbd5b9f6d034b1ee0a43ea9b4a134fb4c92e36ebb9c0ee520477a14ce99e7f" gracePeriod=30 Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.479835 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-notifier" 
containerID="cri-o://dbba1b6233af0065e8654d82f73d3928068f25b322ff4f21848a1f21f25094f2" gracePeriod=30 Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.482631 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dff20986-65c2-4eb2-859c-55ea212165b5","Type":"ContainerStarted","Data":"2fd968200545a121a86a493fc5af37f03248d6af4ea20bccb5b6804090f50e38"} Oct 07 12:11:14 crc kubenswrapper[4700]: I1007 12:11:14.500194 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.7138649080000001 podStartE2EDuration="10.500175932s" podCreationTimestamp="2025-10-07 12:11:04 +0000 UTC" firstStartedPulling="2025-10-07 12:11:05.027618357 +0000 UTC m=+3031.824017346" lastFinishedPulling="2025-10-07 12:11:13.813929381 +0000 UTC m=+3040.610328370" observedRunningTime="2025-10-07 12:11:14.499964857 +0000 UTC m=+3041.296363846" watchObservedRunningTime="2025-10-07 12:11:14.500175932 +0000 UTC m=+3041.296574921" Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494143 4700 generic.go:334] "Generic (PLEG): container finished" podID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerID="dbba1b6233af0065e8654d82f73d3928068f25b322ff4f21848a1f21f25094f2" exitCode=0 Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494544 4700 generic.go:334] "Generic (PLEG): container finished" podID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerID="9ddbd5b9f6d034b1ee0a43ea9b4a134fb4c92e36ebb9c0ee520477a14ce99e7f" exitCode=0 Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494559 4700 generic.go:334] "Generic (PLEG): container finished" podID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerID="c33af5f235c3b5087e43ff77b61337789c7f327ba7921af5478571174a69287a" exitCode=0 Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494313 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerDied","Data":"dbba1b6233af0065e8654d82f73d3928068f25b322ff4f21848a1f21f25094f2"} Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494664 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerDied","Data":"9ddbd5b9f6d034b1ee0a43ea9b4a134fb4c92e36ebb9c0ee520477a14ce99e7f"} Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.494682 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerDied","Data":"c33af5f235c3b5087e43ff77b61337789c7f327ba7921af5478571174a69287a"} Oct 07 12:11:15 crc kubenswrapper[4700]: I1007 12:11:15.497202 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dff20986-65c2-4eb2-859c-55ea212165b5","Type":"ContainerStarted","Data":"b91f14c8660f82a7410b1a4edb5a7852607ab22337fb6c2f04b9701224e50498"} Oct 07 12:11:17 crc kubenswrapper[4700]: I1007 12:11:17.520138 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dff20986-65c2-4eb2-859c-55ea212165b5","Type":"ContainerStarted","Data":"19a51a008b0aa5d3b3e58998a093e46e877eb756759833d9da757b4784cde3cd"} Oct 07 12:11:17 crc kubenswrapper[4700]: I1007 12:11:17.520670 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:11:17 crc kubenswrapper[4700]: I1007 12:11:17.551924 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.18769641 podStartE2EDuration="7.55190191s" podCreationTimestamp="2025-10-07 12:11:10 +0000 UTC" firstStartedPulling="2025-10-07 12:11:12.193527224 +0000 UTC m=+3038.989926213" lastFinishedPulling="2025-10-07 12:11:16.557732724 +0000 UTC m=+3043.354131713" observedRunningTime="2025-10-07 12:11:17.546966731 +0000 UTC 
m=+3044.343365720" watchObservedRunningTime="2025-10-07 12:11:17.55190191 +0000 UTC m=+3044.348300899" Oct 07 12:11:24 crc kubenswrapper[4700]: I1007 12:11:24.957834 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:11:24 crc kubenswrapper[4700]: E1007 12:11:24.958555 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:11:36 crc kubenswrapper[4700]: I1007 12:11:36.956480 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:11:36 crc kubenswrapper[4700]: E1007 12:11:36.957279 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:11:41 crc kubenswrapper[4700]: I1007 12:11:41.091243 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 12:11:44 crc kubenswrapper[4700]: I1007 12:11:44.802466 4700 generic.go:334] "Generic (PLEG): container finished" podID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerID="d5e546f1c8a32ce3d78492cb079e243af12c60a32055e2b1abf05a841c8d9f85" exitCode=137 Oct 07 12:11:44 crc kubenswrapper[4700]: I1007 12:11:44.802507 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerDied","Data":"d5e546f1c8a32ce3d78492cb079e243af12c60a32055e2b1abf05a841c8d9f85"} Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.007344 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.098846 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts\") pod \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.098928 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99gfj\" (UniqueName: \"kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj\") pod \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.098980 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle\") pod \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.099036 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data\") pod \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\" (UID: \"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f\") " Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.105269 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj" (OuterVolumeSpecName: 
"kube-api-access-99gfj") pod "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" (UID: "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f"). InnerVolumeSpecName "kube-api-access-99gfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.130581 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts" (OuterVolumeSpecName: "scripts") pod "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" (UID: "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.201147 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.201189 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99gfj\" (UniqueName: \"kubernetes.io/projected/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-kube-api-access-99gfj\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.211692 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data" (OuterVolumeSpecName: "config-data") pod "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" (UID: "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.230483 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" (UID: "a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.303145 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.303195 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.822488 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f","Type":"ContainerDied","Data":"3a1d0bd7743150d84e54bab9918e9839581eb17e2da1d11e74aaff01502bd081"} Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.822663 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.823040 4700 scope.go:117] "RemoveContainer" containerID="d5e546f1c8a32ce3d78492cb079e243af12c60a32055e2b1abf05a841c8d9f85" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.867036 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.878048 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.896404 4700 scope.go:117] "RemoveContainer" containerID="dbba1b6233af0065e8654d82f73d3928068f25b322ff4f21848a1f21f25094f2" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.903216 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:45 crc kubenswrapper[4700]: E1007 12:11:45.903708 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" 
containerName="aodh-api" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.903728 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-api" Oct 07 12:11:45 crc kubenswrapper[4700]: E1007 12:11:45.903744 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-evaluator" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.903754 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-evaluator" Oct 07 12:11:45 crc kubenswrapper[4700]: E1007 12:11:45.903797 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-notifier" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.903805 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-notifier" Oct 07 12:11:45 crc kubenswrapper[4700]: E1007 12:11:45.903831 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-listener" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.903839 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-listener" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.904068 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-api" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.904090 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-notifier" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.904101 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-listener" Oct 07 
12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.904123 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" containerName="aodh-evaluator" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.909009 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.912006 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.912120 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.912246 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.912400 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.913337 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.934546 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.941059 4700 scope.go:117] "RemoveContainer" containerID="9ddbd5b9f6d034b1ee0a43ea9b4a134fb4c92e36ebb9c0ee520477a14ce99e7f" Oct 07 12:11:45 crc kubenswrapper[4700]: I1007 12:11:45.971207 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f" path="/var/lib/kubelet/pods/a0b95fb2-4ed6-4422-ac77-4b8ebae5da3f/volumes" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.018973 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.019019 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrvnv\" (UniqueName: \"kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.019370 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.019427 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.019723 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.019773 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 
crc kubenswrapper[4700]: I1007 12:11:46.033660 4700 scope.go:117] "RemoveContainer" containerID="c33af5f235c3b5087e43ff77b61337789c7f327ba7921af5478571174a69287a" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122092 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122162 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrvnv\" (UniqueName: \"kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122293 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122359 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122460 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.122488 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.126804 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.127357 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.128574 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.128736 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.132887 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.141101 4700 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rrvnv\" (UniqueName: \"kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv\") pod \"aodh-0\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.233297 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.733098 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:11:46 crc kubenswrapper[4700]: I1007 12:11:46.833858 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerStarted","Data":"14ce5b803c7b19ddb3f6f4a88e5b9bdff34e8717412d80fc9f74c5c822e44a7c"} Oct 07 12:11:48 crc kubenswrapper[4700]: I1007 12:11:48.853563 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerStarted","Data":"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77"} Oct 07 12:11:48 crc kubenswrapper[4700]: I1007 12:11:48.957092 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:11:48 crc kubenswrapper[4700]: E1007 12:11:48.957700 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:11:49 crc kubenswrapper[4700]: I1007 12:11:49.867495 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerStarted","Data":"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec"} Oct 07 12:11:50 crc kubenswrapper[4700]: I1007 12:11:50.882025 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerStarted","Data":"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9"} Oct 07 12:11:52 crc kubenswrapper[4700]: I1007 12:11:52.906392 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerStarted","Data":"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489"} Oct 07 12:11:52 crc kubenswrapper[4700]: I1007 12:11:52.938759 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.969554513 podStartE2EDuration="7.938732847s" podCreationTimestamp="2025-10-07 12:11:45 +0000 UTC" firstStartedPulling="2025-10-07 12:11:46.746468629 +0000 UTC m=+3073.542867638" lastFinishedPulling="2025-10-07 12:11:51.715646963 +0000 UTC m=+3078.512045972" observedRunningTime="2025-10-07 12:11:52.929612439 +0000 UTC m=+3079.726011488" watchObservedRunningTime="2025-10-07 12:11:52.938732847 +0000 UTC m=+3079.735131866" Oct 07 12:12:03 crc kubenswrapper[4700]: I1007 12:12:03.968839 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:12:03 crc kubenswrapper[4700]: E1007 12:12:03.969615 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:12:14 crc kubenswrapper[4700]: I1007 12:12:14.957246 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:12:14 crc kubenswrapper[4700]: E1007 12:12:14.958297 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:12:29 crc kubenswrapper[4700]: I1007 12:12:29.957051 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:12:29 crc kubenswrapper[4700]: E1007 12:12:29.958201 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:12:43 crc kubenswrapper[4700]: I1007 12:12:43.973636 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:12:43 crc kubenswrapper[4700]: E1007 12:12:43.974684 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:12:58 crc kubenswrapper[4700]: I1007 12:12:58.957386 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:12:58 crc kubenswrapper[4700]: E1007 12:12:58.958562 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:13:10 crc kubenswrapper[4700]: I1007 12:13:10.957366 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:13:10 crc kubenswrapper[4700]: E1007 12:13:10.958473 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:13:23 crc kubenswrapper[4700]: I1007 12:13:23.959173 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:13:23 crc kubenswrapper[4700]: E1007 12:13:23.960358 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:13:36 crc kubenswrapper[4700]: I1007 12:13:36.958459 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:13:36 crc kubenswrapper[4700]: E1007 12:13:36.959718 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:13:47 crc kubenswrapper[4700]: I1007 12:13:47.961116 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:13:49 crc kubenswrapper[4700]: I1007 12:13:49.241923 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be"} Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.051581 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.054757 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.063869 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.109996 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.110353 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tkq\" (UniqueName: \"kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.110427 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.212058 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tkq\" (UniqueName: \"kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.212109 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.212213 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.212720 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.212733 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.232552 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tkq\" (UniqueName: \"kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq\") pod \"redhat-marketplace-l49wv\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.385405 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:31 crc kubenswrapper[4700]: I1007 12:14:31.885437 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:32 crc kubenswrapper[4700]: I1007 12:14:32.737229 4700 generic.go:334] "Generic (PLEG): container finished" podID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerID="bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b" exitCode=0 Oct 07 12:14:32 crc kubenswrapper[4700]: I1007 12:14:32.737278 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerDied","Data":"bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b"} Oct 07 12:14:32 crc kubenswrapper[4700]: I1007 12:14:32.738582 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerStarted","Data":"fbeea05e03667587952dde520cdc3b61e58be2dcbbffe3c89eab4d3518065d37"} Oct 07 12:14:33 crc kubenswrapper[4700]: I1007 12:14:33.756560 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerStarted","Data":"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe"} Oct 07 12:14:34 crc kubenswrapper[4700]: I1007 12:14:34.708993 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:14:34 crc kubenswrapper[4700]: I1007 12:14:34.770420 4700 generic.go:334] "Generic (PLEG): container finished" podID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerID="22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe" exitCode=0 Oct 07 12:14:34 
crc kubenswrapper[4700]: I1007 12:14:34.770468 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerDied","Data":"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe"} Oct 07 12:14:35 crc kubenswrapper[4700]: I1007 12:14:35.783716 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerStarted","Data":"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506"} Oct 07 12:14:35 crc kubenswrapper[4700]: I1007 12:14:35.809024 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l49wv" podStartSLOduration=2.289497361 podStartE2EDuration="4.808999398s" podCreationTimestamp="2025-10-07 12:14:31 +0000 UTC" firstStartedPulling="2025-10-07 12:14:32.740142651 +0000 UTC m=+3239.536541640" lastFinishedPulling="2025-10-07 12:14:35.259644668 +0000 UTC m=+3242.056043677" observedRunningTime="2025-10-07 12:14:35.800823964 +0000 UTC m=+3242.597222963" watchObservedRunningTime="2025-10-07 12:14:35.808999398 +0000 UTC m=+3242.605398397" Oct 07 12:14:41 crc kubenswrapper[4700]: I1007 12:14:41.385705 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:41 crc kubenswrapper[4700]: I1007 12:14:41.387478 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:41 crc kubenswrapper[4700]: I1007 12:14:41.466810 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:41 crc kubenswrapper[4700]: I1007 12:14:41.908391 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:41 crc kubenswrapper[4700]: I1007 12:14:41.972297 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:43 crc kubenswrapper[4700]: I1007 12:14:43.865044 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l49wv" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="registry-server" containerID="cri-o://78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506" gracePeriod=2 Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.458048 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.653224 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tkq\" (UniqueName: \"kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq\") pod \"59b759c1-b3ea-47da-a8df-be6fb064bda1\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.653513 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content\") pod \"59b759c1-b3ea-47da-a8df-be6fb064bda1\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.653562 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities\") pod \"59b759c1-b3ea-47da-a8df-be6fb064bda1\" (UID: \"59b759c1-b3ea-47da-a8df-be6fb064bda1\") " Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.654966 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities" (OuterVolumeSpecName: "utilities") pod "59b759c1-b3ea-47da-a8df-be6fb064bda1" (UID: "59b759c1-b3ea-47da-a8df-be6fb064bda1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.662841 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq" (OuterVolumeSpecName: "kube-api-access-p7tkq") pod "59b759c1-b3ea-47da-a8df-be6fb064bda1" (UID: "59b759c1-b3ea-47da-a8df-be6fb064bda1"). InnerVolumeSpecName "kube-api-access-p7tkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.667975 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59b759c1-b3ea-47da-a8df-be6fb064bda1" (UID: "59b759c1-b3ea-47da-a8df-be6fb064bda1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.756079 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tkq\" (UniqueName: \"kubernetes.io/projected/59b759c1-b3ea-47da-a8df-be6fb064bda1-kube-api-access-p7tkq\") on node \"crc\" DevicePath \"\"" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.756127 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.756144 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b759c1-b3ea-47da-a8df-be6fb064bda1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.878517 4700 generic.go:334] "Generic (PLEG): container finished" podID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerID="78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506" exitCode=0 Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.878579 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l49wv" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.878596 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerDied","Data":"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506"} Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.878673 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l49wv" event={"ID":"59b759c1-b3ea-47da-a8df-be6fb064bda1","Type":"ContainerDied","Data":"fbeea05e03667587952dde520cdc3b61e58be2dcbbffe3c89eab4d3518065d37"} Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.878712 4700 scope.go:117] "RemoveContainer" containerID="78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.922402 4700 scope.go:117] "RemoveContainer" containerID="22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe" Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.939345 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.949661 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l49wv"] Oct 07 12:14:44 crc kubenswrapper[4700]: I1007 12:14:44.950255 4700 scope.go:117] "RemoveContainer" containerID="bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.001871 4700 scope.go:117] "RemoveContainer" containerID="78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506" Oct 07 12:14:45 crc kubenswrapper[4700]: E1007 12:14:45.002524 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506\": container with ID starting with 78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506 not found: ID does not exist" containerID="78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.002577 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506"} err="failed to get container status \"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506\": rpc error: code = NotFound desc = could not find container \"78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506\": container with ID starting with 78ef3c9b474fb068ed58a26672762a05b5100b168ad60a55036600beb3886506 not found: ID does not exist" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.002611 4700 scope.go:117] "RemoveContainer" containerID="22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe" Oct 07 12:14:45 crc kubenswrapper[4700]: E1007 12:14:45.002957 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe\": container with ID starting with 22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe not found: ID does not exist" containerID="22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.002996 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe"} err="failed to get container status \"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe\": rpc error: code = NotFound desc = could not find container \"22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe\": container with ID 
starting with 22d94784dad4454cf25b03fd22b2aa4741637c510274466ea4c7006d0e5e5bfe not found: ID does not exist" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.003021 4700 scope.go:117] "RemoveContainer" containerID="bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b" Oct 07 12:14:45 crc kubenswrapper[4700]: E1007 12:14:45.004754 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b\": container with ID starting with bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b not found: ID does not exist" containerID="bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.004795 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b"} err="failed to get container status \"bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b\": rpc error: code = NotFound desc = could not find container \"bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b\": container with ID starting with bacedd331ebb549f222987ac6d784b52b372a89ef00a6b1e192e3cf0285b692b not found: ID does not exist" Oct 07 12:14:45 crc kubenswrapper[4700]: I1007 12:14:45.967932 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" path="/var/lib/kubelet/pods/59b759c1-b3ea-47da-a8df-be6fb064bda1/volumes" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.562130 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m"] Oct 07 12:14:50 crc kubenswrapper[4700]: E1007 12:14:50.563859 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" 
containerName="registry-server" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.563878 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="registry-server" Oct 07 12:14:50 crc kubenswrapper[4700]: E1007 12:14:50.563912 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="extract-content" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.563920 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="extract-content" Oct 07 12:14:50 crc kubenswrapper[4700]: E1007 12:14:50.563949 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="extract-utilities" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.563958 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="extract-utilities" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.564185 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b759c1-b3ea-47da-a8df-be6fb064bda1" containerName="registry-server" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.566099 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.568859 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.583883 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m"] Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.668172 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9bd\" (UniqueName: \"kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.668251 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.668377 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: 
I1007 12:14:50.769666 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.769855 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9bd\" (UniqueName: \"kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.769880 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.770251 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.770373 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.794513 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9bd\" (UniqueName: \"kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:50 crc kubenswrapper[4700]: I1007 12:14:50.887131 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:51 crc kubenswrapper[4700]: I1007 12:14:51.385135 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m"] Oct 07 12:14:51 crc kubenswrapper[4700]: I1007 12:14:51.979475 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerStarted","Data":"6e160bff208376bcc33399c177556c6bbbf933e7f00cfdd061d6e3c27af6c49f"} Oct 07 12:14:51 crc kubenswrapper[4700]: I1007 12:14:51.979889 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerStarted","Data":"ff77eafeea2af27be7646b21a32936cbb89162df457e521e184e59558ba34792"} Oct 07 12:14:52 crc kubenswrapper[4700]: I1007 12:14:52.975610 4700 
generic.go:334] "Generic (PLEG): container finished" podID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerID="6e160bff208376bcc33399c177556c6bbbf933e7f00cfdd061d6e3c27af6c49f" exitCode=0 Oct 07 12:14:52 crc kubenswrapper[4700]: I1007 12:14:52.975721 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerDied","Data":"6e160bff208376bcc33399c177556c6bbbf933e7f00cfdd061d6e3c27af6c49f"} Oct 07 12:14:56 crc kubenswrapper[4700]: I1007 12:14:56.005676 4700 generic.go:334] "Generic (PLEG): container finished" podID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerID="c9e236a27176aafbd9540288210fefbf65dfbf29fb76924f04abeb37f13f915c" exitCode=0 Oct 07 12:14:56 crc kubenswrapper[4700]: I1007 12:14:56.005714 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerDied","Data":"c9e236a27176aafbd9540288210fefbf65dfbf29fb76924f04abeb37f13f915c"} Oct 07 12:14:57 crc kubenswrapper[4700]: I1007 12:14:57.022052 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerStarted","Data":"24b06dd879ea9d6e8f56a37f3623b3142293e4ce9194bff93b71c441c4d7fd7a"} Oct 07 12:14:57 crc kubenswrapper[4700]: I1007 12:14:57.049922 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" podStartSLOduration=5.180210037 podStartE2EDuration="7.049906624s" podCreationTimestamp="2025-10-07 12:14:50 +0000 UTC" firstStartedPulling="2025-10-07 12:14:52.979176931 +0000 UTC m=+3259.775575920" lastFinishedPulling="2025-10-07 
12:14:54.848873508 +0000 UTC m=+3261.645272507" observedRunningTime="2025-10-07 12:14:57.047867981 +0000 UTC m=+3263.844267010" watchObservedRunningTime="2025-10-07 12:14:57.049906624 +0000 UTC m=+3263.846305613" Oct 07 12:14:58 crc kubenswrapper[4700]: I1007 12:14:58.038402 4700 generic.go:334] "Generic (PLEG): container finished" podID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerID="24b06dd879ea9d6e8f56a37f3623b3142293e4ce9194bff93b71c441c4d7fd7a" exitCode=0 Oct 07 12:14:58 crc kubenswrapper[4700]: I1007 12:14:58.038517 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerDied","Data":"24b06dd879ea9d6e8f56a37f3623b3142293e4ce9194bff93b71c441c4d7fd7a"} Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.478441 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.563603 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util\") pod \"a1e54796-2008-49d7-9ab4-0a865b57e743\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.563835 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9bd\" (UniqueName: \"kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd\") pod \"a1e54796-2008-49d7-9ab4-0a865b57e743\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.563970 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle\") pod \"a1e54796-2008-49d7-9ab4-0a865b57e743\" (UID: \"a1e54796-2008-49d7-9ab4-0a865b57e743\") " Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.566108 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle" (OuterVolumeSpecName: "bundle") pod "a1e54796-2008-49d7-9ab4-0a865b57e743" (UID: "a1e54796-2008-49d7-9ab4-0a865b57e743"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.573439 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd" (OuterVolumeSpecName: "kube-api-access-sw9bd") pod "a1e54796-2008-49d7-9ab4-0a865b57e743" (UID: "a1e54796-2008-49d7-9ab4-0a865b57e743"). InnerVolumeSpecName "kube-api-access-sw9bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.587843 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util" (OuterVolumeSpecName: "util") pod "a1e54796-2008-49d7-9ab4-0a865b57e743" (UID: "a1e54796-2008-49d7-9ab4-0a865b57e743"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.666483 4700 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.666514 4700 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1e54796-2008-49d7-9ab4-0a865b57e743-util\") on node \"crc\" DevicePath \"\"" Oct 07 12:14:59 crc kubenswrapper[4700]: I1007 12:14:59.666524 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9bd\" (UniqueName: \"kubernetes.io/projected/a1e54796-2008-49d7-9ab4-0a865b57e743-kube-api-access-sw9bd\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.067542 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" event={"ID":"a1e54796-2008-49d7-9ab4-0a865b57e743","Type":"ContainerDied","Data":"ff77eafeea2af27be7646b21a32936cbb89162df457e521e184e59558ba34792"} Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.067650 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.067633 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff77eafeea2af27be7646b21a32936cbb89162df457e521e184e59558ba34792" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.172895 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h"] Oct 07 12:15:00 crc kubenswrapper[4700]: E1007 12:15:00.173334 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="pull" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.173349 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="pull" Oct 07 12:15:00 crc kubenswrapper[4700]: E1007 12:15:00.173377 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="extract" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.173383 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="extract" Oct 07 12:15:00 crc kubenswrapper[4700]: E1007 12:15:00.173402 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="util" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.173407 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="util" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.173592 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e54796-2008-49d7-9ab4-0a865b57e743" containerName="extract" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.174227 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.190768 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.190795 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.200938 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h"] Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.294093 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.294835 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnjm\" (UniqueName: \"kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.295047 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.397083 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.397135 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnjm\" (UniqueName: \"kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.397208 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.398275 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.407247 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.422439 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnjm\" (UniqueName: \"kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm\") pod \"collect-profiles-29330655-5hf7h\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.516498 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:00 crc kubenswrapper[4700]: I1007 12:15:00.819887 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h"] Oct 07 12:15:01 crc kubenswrapper[4700]: I1007 12:15:01.091841 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" event={"ID":"ed0d7b5f-8f96-4bb8-a646-105aadbd7947","Type":"ContainerStarted","Data":"087f68a5ed91c4710694e7df5ae54b01c68c8d130c4743359c44654eb9b4b834"} Oct 07 12:15:02 crc kubenswrapper[4700]: I1007 12:15:02.104844 4700 generic.go:334] "Generic (PLEG): container finished" podID="ed0d7b5f-8f96-4bb8-a646-105aadbd7947" containerID="9c68e570c0f4e91378fa89d0fbb8438da73117d45646fc392ea1b5dd6a1edc52" exitCode=0 Oct 07 12:15:02 crc kubenswrapper[4700]: I1007 12:15:02.104888 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" 
event={"ID":"ed0d7b5f-8f96-4bb8-a646-105aadbd7947","Type":"ContainerDied","Data":"9c68e570c0f4e91378fa89d0fbb8438da73117d45646fc392ea1b5dd6a1edc52"} Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.786682 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.865567 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume\") pod \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.865748 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxnjm\" (UniqueName: \"kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm\") pod \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.865826 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume\") pod \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\" (UID: \"ed0d7b5f-8f96-4bb8-a646-105aadbd7947\") " Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.866858 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed0d7b5f-8f96-4bb8-a646-105aadbd7947" (UID: "ed0d7b5f-8f96-4bb8-a646-105aadbd7947"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.875552 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm" (OuterVolumeSpecName: "kube-api-access-vxnjm") pod "ed0d7b5f-8f96-4bb8-a646-105aadbd7947" (UID: "ed0d7b5f-8f96-4bb8-a646-105aadbd7947"). InnerVolumeSpecName "kube-api-access-vxnjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.878427 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed0d7b5f-8f96-4bb8-a646-105aadbd7947" (UID: "ed0d7b5f-8f96-4bb8-a646-105aadbd7947"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.973224 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.973256 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxnjm\" (UniqueName: \"kubernetes.io/projected/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-kube-api-access-vxnjm\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:03 crc kubenswrapper[4700]: I1007 12:15:03.973266 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0d7b5f-8f96-4bb8-a646-105aadbd7947-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:04 crc kubenswrapper[4700]: I1007 12:15:04.123419 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" 
event={"ID":"ed0d7b5f-8f96-4bb8-a646-105aadbd7947","Type":"ContainerDied","Data":"087f68a5ed91c4710694e7df5ae54b01c68c8d130c4743359c44654eb9b4b834"} Oct 07 12:15:04 crc kubenswrapper[4700]: I1007 12:15:04.123455 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087f68a5ed91c4710694e7df5ae54b01c68c8d130c4743359c44654eb9b4b834" Oct 07 12:15:04 crc kubenswrapper[4700]: I1007 12:15:04.123489 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-5hf7h" Oct 07 12:15:04 crc kubenswrapper[4700]: I1007 12:15:04.862611 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp"] Oct 07 12:15:04 crc kubenswrapper[4700]: I1007 12:15:04.878150 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330610-l56qp"] Oct 07 12:15:06 crc kubenswrapper[4700]: I1007 12:15:06.047237 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f8db84c-25d7-401f-b5f1-93fb1c324f87" path="/var/lib/kubelet/pods/3f8db84c-25d7-401f-b5f1-93fb1c324f87/volumes" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.171450 4700 scope.go:117] "RemoveContainer" containerID="f895705b2f8fe8c60aee9726859046272402b99218a6bdd7f69f6941b2c470b6" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.237176 4700 scope.go:117] "RemoveContainer" containerID="106f8f681a92f0289f32586390c0e4ea292eb272cb0a722039ae45aa9393ca87" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.260990 4700 scope.go:117] "RemoveContainer" containerID="8e26b16f7a5607efd4b0fd262d95abd1e18bd84a8ddae5c9d60d789acb6c604d" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.289933 4700 scope.go:117] "RemoveContainer" containerID="e91a3b957339d6877c3f15cb001648019413c3ecb1378a4c57201096db49f984" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 
12:15:07.681951 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj"] Oct 07 12:15:07 crc kubenswrapper[4700]: E1007 12:15:07.682334 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0d7b5f-8f96-4bb8-a646-105aadbd7947" containerName="collect-profiles" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.682350 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0d7b5f-8f96-4bb8-a646-105aadbd7947" containerName="collect-profiles" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.682540 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0d7b5f-8f96-4bb8-a646-105aadbd7947" containerName="collect-profiles" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.683102 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.687791 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.688171 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.688387 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wvffd" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.697028 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj"] Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.742025 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58bxt\" (UniqueName: \"kubernetes.io/projected/398ce44d-03fb-4ee9-ac61-2ca3fd52074e-kube-api-access-58bxt\") pod 
\"obo-prometheus-operator-7c8cf85677-8xvhj\" (UID: \"398ce44d-03fb-4ee9-ac61-2ca3fd52074e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.844248 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58bxt\" (UniqueName: \"kubernetes.io/projected/398ce44d-03fb-4ee9-ac61-2ca3fd52074e-kube-api-access-58bxt\") pod \"obo-prometheus-operator-7c8cf85677-8xvhj\" (UID: \"398ce44d-03fb-4ee9-ac61-2ca3fd52074e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.866780 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58bxt\" (UniqueName: \"kubernetes.io/projected/398ce44d-03fb-4ee9-ac61-2ca3fd52074e-kube-api-access-58bxt\") pod \"obo-prometheus-operator-7c8cf85677-8xvhj\" (UID: \"398ce44d-03fb-4ee9-ac61-2ca3fd52074e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.881554 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz"] Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.882839 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.887801 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ptfsx" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.888036 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.926505 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d"] Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.927993 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.946634 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.946788 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:07 crc kubenswrapper[4700]: I1007 12:15:07.951904 4700 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.002652 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.013080 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.049536 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.049751 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.049798 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.049954 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.055963 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.072349 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0319fc60-cd28-49d8-af70-3a2306fe89fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz\" (UID: \"0319fc60-cd28-49d8-af70-3a2306fe89fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.103377 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-w5vqc"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.105082 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.107706 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-8c8tw" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.110249 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.125964 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-w5vqc"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.158000 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82137a0-2748-492d-bd33-39b03e9c8139-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.158058 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfn7\" (UniqueName: \"kubernetes.io/projected/c82137a0-2748-492d-bd33-39b03e9c8139-kube-api-access-pdfn7\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.158152 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.158270 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.175261 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.175847 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d619134-8aae-4140-b5ca-33deeac1a66c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d\" (UID: \"7d619134-8aae-4140-b5ca-33deeac1a66c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.257658 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.259442 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82137a0-2748-492d-bd33-39b03e9c8139-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.259482 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfn7\" (UniqueName: \"kubernetes.io/projected/c82137a0-2748-492d-bd33-39b03e9c8139-kube-api-access-pdfn7\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.285213 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfn7\" (UniqueName: \"kubernetes.io/projected/c82137a0-2748-492d-bd33-39b03e9c8139-kube-api-access-pdfn7\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.292700 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82137a0-2748-492d-bd33-39b03e9c8139-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-w5vqc\" (UID: \"c82137a0-2748-492d-bd33-39b03e9c8139\") " pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.295158 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.315816 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fnrbp"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.317373 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.321397 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-nvfb6" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.354362 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fnrbp"] Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.360770 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45384a63-61c2-4d8b-906a-e7545addde11-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.360823 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9h5j\" (UniqueName: \"kubernetes.io/projected/45384a63-61c2-4d8b-906a-e7545addde11-kube-api-access-t9h5j\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.457576 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.462509 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45384a63-61c2-4d8b-906a-e7545addde11-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.462580 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9h5j\" (UniqueName: \"kubernetes.io/projected/45384a63-61c2-4d8b-906a-e7545addde11-kube-api-access-t9h5j\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.464018 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45384a63-61c2-4d8b-906a-e7545addde11-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.504111 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9h5j\" (UniqueName: \"kubernetes.io/projected/45384a63-61c2-4d8b-906a-e7545addde11-kube-api-access-t9h5j\") pod \"perses-operator-54bc95c9fb-fnrbp\" (UID: \"45384a63-61c2-4d8b-906a-e7545addde11\") " pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.703369 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj"] Oct 07 12:15:08 crc kubenswrapper[4700]: W1007 
12:15:08.715555 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398ce44d_03fb_4ee9_ac61_2ca3fd52074e.slice/crio-b3a5fa4e3fb4834b0c2c212df8b9802bb3c4f4f9afcf4ee185b9be45cfd22ed5 WatchSource:0}: Error finding container b3a5fa4e3fb4834b0c2c212df8b9802bb3c4f4f9afcf4ee185b9be45cfd22ed5: Status 404 returned error can't find the container with id b3a5fa4e3fb4834b0c2c212df8b9802bb3c4f4f9afcf4ee185b9be45cfd22ed5 Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.760997 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.947187 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d"] Oct 07 12:15:08 crc kubenswrapper[4700]: W1007 12:15:08.952324 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0319fc60_cd28_49d8_af70_3a2306fe89fd.slice/crio-d5aebf63e1a3d1c022aa9cf03f9d56e4bd8d9cdc68453d84a14e42acc6d66d94 WatchSource:0}: Error finding container d5aebf63e1a3d1c022aa9cf03f9d56e4bd8d9cdc68453d84a14e42acc6d66d94: Status 404 returned error can't find the container with id d5aebf63e1a3d1c022aa9cf03f9d56e4bd8d9cdc68453d84a14e42acc6d66d94 Oct 07 12:15:08 crc kubenswrapper[4700]: I1007 12:15:08.956151 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz"] Oct 07 12:15:08 crc kubenswrapper[4700]: W1007 12:15:08.977585 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d619134_8aae_4140_b5ca_33deeac1a66c.slice/crio-a63ff3eaf6699ee5022f3870a921da08f6494f2b648732f5523795f548b1b935 WatchSource:0}: Error finding container 
a63ff3eaf6699ee5022f3870a921da08f6494f2b648732f5523795f548b1b935: Status 404 returned error can't find the container with id a63ff3eaf6699ee5022f3870a921da08f6494f2b648732f5523795f548b1b935 Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.129535 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-w5vqc"] Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.221530 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" event={"ID":"7d619134-8aae-4140-b5ca-33deeac1a66c","Type":"ContainerStarted","Data":"a63ff3eaf6699ee5022f3870a921da08f6494f2b648732f5523795f548b1b935"} Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.229672 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" event={"ID":"398ce44d-03fb-4ee9-ac61-2ca3fd52074e","Type":"ContainerStarted","Data":"b3a5fa4e3fb4834b0c2c212df8b9802bb3c4f4f9afcf4ee185b9be45cfd22ed5"} Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.231078 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" event={"ID":"c82137a0-2748-492d-bd33-39b03e9c8139","Type":"ContainerStarted","Data":"76f7252c7d7cb094131c617ea91b27ee8bb9b7e8faf2a8c7a36c016389423d30"} Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.233122 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" event={"ID":"0319fc60-cd28-49d8-af70-3a2306fe89fd","Type":"ContainerStarted","Data":"d5aebf63e1a3d1c022aa9cf03f9d56e4bd8d9cdc68453d84a14e42acc6d66d94"} Oct 07 12:15:09 crc kubenswrapper[4700]: I1007 12:15:09.278293 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fnrbp"] Oct 07 12:15:09 crc kubenswrapper[4700]: W1007 12:15:09.285647 
4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45384a63_61c2_4d8b_906a_e7545addde11.slice/crio-b5dd21b0ecbbe9eb8e6a890dae71d0bdd5b01f4b39451def72aef7ee9770b429 WatchSource:0}: Error finding container b5dd21b0ecbbe9eb8e6a890dae71d0bdd5b01f4b39451def72aef7ee9770b429: Status 404 returned error can't find the container with id b5dd21b0ecbbe9eb8e6a890dae71d0bdd5b01f4b39451def72aef7ee9770b429 Oct 07 12:15:10 crc kubenswrapper[4700]: I1007 12:15:10.284918 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" event={"ID":"45384a63-61c2-4d8b-906a-e7545addde11","Type":"ContainerStarted","Data":"b5dd21b0ecbbe9eb8e6a890dae71d0bdd5b01f4b39451def72aef7ee9770b429"} Oct 07 12:15:30 crc kubenswrapper[4700]: E1007 12:15:30.568263 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage420814131/2\": happened during read: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Oct 07 12:15:30 crc kubenswrapper[4700]: E1007 12:15:30.568851 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d_openshift-operators(7d619134-8aae-4140-b5ca-33deeac1a66c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage420814131/2\": happened during read: context canceled" logger="UnhandledError" Oct 07 12:15:30 crc kubenswrapper[4700]: E1007 12:15:30.570017 4700 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage420814131/2\\\": happened during read: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" podUID="7d619134-8aae-4140-b5ca-33deeac1a66c" Oct 07 12:15:30 crc kubenswrapper[4700]: E1007 12:15:30.596213 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" podUID="7d619134-8aae-4140-b5ca-33deeac1a66c" Oct 07 12:15:32 crc kubenswrapper[4700]: E1007 12:15:32.269791 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5" Oct 07 12:15:32 crc kubenswrapper[4700]: E1007 12:15:32.270385 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator 
--thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58bxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-7c8cf85677-8xvhj_openshift-operators(398ce44d-03fb-4ee9-ac61-2ca3fd52074e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:15:32 crc 
kubenswrapper[4700]: E1007 12:15:32.271823 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" podUID="398ce44d-03fb-4ee9-ac61-2ca3fd52074e" Oct 07 12:15:32 crc kubenswrapper[4700]: E1007 12:15:32.631717 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5\\\"\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" podUID="398ce44d-03fb-4ee9-ac61-2ca3fd52074e" Oct 07 12:15:33 crc kubenswrapper[4700]: E1007 12:15:33.013601 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Oct 07 12:15:33 crc kubenswrapper[4700]: E1007 12:15:33.014212 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz_openshift-operators(0319fc60-cd28-49d8-af70-3a2306fe89fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:15:33 crc kubenswrapper[4700]: E1007 12:15:33.015360 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" podUID="0319fc60-cd28-49d8-af70-3a2306fe89fd" Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.401769 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.402047 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-api" containerID="cri-o://3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77" gracePeriod=30 Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.402092 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-listener" containerID="cri-o://c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489" gracePeriod=30 Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.402159 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-notifier" containerID="cri-o://5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9" gracePeriod=30 Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.402178 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-evaluator" containerID="cri-o://68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec" gracePeriod=30 Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.642070 4700 generic.go:334] "Generic (PLEG): container finished" podID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" 
containerID="3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77" exitCode=0 Oct 07 12:15:33 crc kubenswrapper[4700]: I1007 12:15:33.642149 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerDied","Data":"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77"} Oct 07 12:15:33 crc kubenswrapper[4700]: E1007 12:15:33.643838 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" podUID="0319fc60-cd28-49d8-af70-3a2306fe89fd" Oct 07 12:15:34 crc kubenswrapper[4700]: E1007 12:15:34.006971 4700 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c" Oct 07 12:15:34 crc kubenswrapper[4700]: E1007 12:15:34.007519 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: 
{{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t9h5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-54bc95c9fb-fnrbp_openshift-operators(45384a63-61c2-4d8b-906a-e7545addde11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:15:34 crc kubenswrapper[4700]: E1007 12:15:34.009011 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" podUID="45384a63-61c2-4d8b-906a-e7545addde11" Oct 07 12:15:34 crc kubenswrapper[4700]: I1007 12:15:34.654087 4700 generic.go:334] "Generic (PLEG): container finished" podID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerID="c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489" exitCode=0 Oct 07 12:15:34 crc kubenswrapper[4700]: I1007 12:15:34.654114 4700 generic.go:334] "Generic (PLEG): container finished" podID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerID="68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec" exitCode=0 Oct 07 12:15:34 crc kubenswrapper[4700]: I1007 12:15:34.654906 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerDied","Data":"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489"} Oct 07 12:15:34 crc kubenswrapper[4700]: I1007 12:15:34.654929 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerDied","Data":"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec"} Oct 07 12:15:34 crc kubenswrapper[4700]: E1007 12:15:34.655899 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c\\\"\"" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" podUID="45384a63-61c2-4d8b-906a-e7545addde11" Oct 07 12:15:37 crc kubenswrapper[4700]: E1007 12:15:37.148460 4700 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e" Oct 07 12:15:37 crc kubenswrapper[4700]: E1007 12:15:37.149054 4700 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e,Command:[],Args:[--namespace=$(NAMESPACE) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=perses=$(RELATED_IMAGE_PERSES) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:4d25b0e31549d780928d2dd3eed7defd9c6d460deb92dcff0fe72c5023029404,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:f3806c97420ec8ba91895ce7627df7612cccb927c05d7854377f45cdd6c924a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-0-50-rhel9@sha256:4b5e53d226733237fc5abd0476eb3c96162cf3d8da7aeba8deda631fa8987223,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-0-4-rhel9@sha256:53125bddbefca2ba2b57c3fd74bd4b376da803e420201220548878f557bd6610,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-1-0-rhel9@sha256:1dbe9a684271e00c8f36d8b96c9b22f6ee3c6f907ea6ad20980901bd533f9a3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-4-rhel9@sha256:6aafab2c90bcbc6702f2d63d585a764baa8de8207e6af7afa60f3976ddfa9bd3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-0-3-rhel9@sha256:9f80851e8137c2c5e5c2aee13fc663f6c7124d952
4d88c06c1507748ce84e1ed,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-6-1-rhel9@sha256:2c9b2be12f15f06a24393dbab6a31682cee399d42e2cc04b0dcf03b2b598d5cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-6-0-rhel9@sha256:e9042d93f624790c450724158a8323277e4dd136530c763fec8db31f51fd8552,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-0-4-rhel9@sha256:456d45001816b9adc38745e0ad8705bdc0150d03d0f65e0dfa9caf3fb8980fad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-5-rhel9@sha256:f3446969c67c18b44bee38ac946091fe9397a2117cb5b7aacb39406461c1efe1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-0-4-rhel9@sha256:ade84f8be7d23bd4b9c80e07462dc947280f0bcf6071e6edd927fef54c254b7e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:039e139cf9217bbe72248674df76cbe4baf4bef9f8dc367d2cb51eae9c4aa9d7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:142180f277f0221ef2d4176f9af6dcdb4e7ab434a68f0dfad2ee5bee0e667ddd,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdfn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-cc5f78dfc-w5vqc_openshift-operators(c82137a0-2748-492d-bd33-39b03e9c8139): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:15:37 crc kubenswrapper[4700]: E1007 12:15:37.150793 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" podUID="c82137a0-2748-492d-bd33-39b03e9c8139" Oct 07 12:15:37 crc kubenswrapper[4700]: E1007 12:15:37.698424 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:27ffe36aad6e606e6d0a211f48f3cdb58a53aa0d5e8ead6a444427231261ab9e\\\"\"" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" podUID="c82137a0-2748-492d-bd33-39b03e9c8139" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.654766 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.717517 4700 generic.go:334] "Generic (PLEG): container finished" podID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerID="5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9" exitCode=0 Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.717812 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerDied","Data":"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9"} Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.717838 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d8ee9baf-609a-49cc-b7cb-4dac97633cf7","Type":"ContainerDied","Data":"14ce5b803c7b19ddb3f6f4a88e5b9bdff34e8717412d80fc9f74c5c822e44a7c"} Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.717853 4700 scope.go:117] "RemoveContainer" containerID="c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489" Oct 07 12:15:39 crc 
kubenswrapper[4700]: I1007 12:15:39.718175 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.761058 4700 scope.go:117] "RemoveContainer" containerID="5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774133 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774264 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774331 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774348 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774381 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrvnv\" (UniqueName: 
\"kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.774496 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts\") pod \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\" (UID: \"d8ee9baf-609a-49cc-b7cb-4dac97633cf7\") " Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.780707 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv" (OuterVolumeSpecName: "kube-api-access-rrvnv") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "kube-api-access-rrvnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.782510 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts" (OuterVolumeSpecName: "scripts") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.795477 4700 scope.go:117] "RemoveContainer" containerID="68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.847494 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.875123 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.877510 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.877527 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.877536 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.877544 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrvnv\" (UniqueName: \"kubernetes.io/projected/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-kube-api-access-rrvnv\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.915797 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data" (OuterVolumeSpecName: "config-data") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.919169 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8ee9baf-609a-49cc-b7cb-4dac97633cf7" (UID: "d8ee9baf-609a-49cc-b7cb-4dac97633cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.956723 4700 scope.go:117] "RemoveContainer" containerID="3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.979230 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.979602 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ee9baf-609a-49cc-b7cb-4dac97633cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.998874 4700 scope.go:117] "RemoveContainer" containerID="c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489" Oct 07 12:15:39 crc kubenswrapper[4700]: E1007 12:15:39.999248 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489\": container with ID starting with c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489 not found: ID does not exist" containerID="c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.999372 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489"} err="failed to get container status \"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489\": rpc error: code = NotFound desc = could not find container \"c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489\": container with ID starting with c8011c24efaede31009a73b238f187fa55916e70bfc6fd31438c60b8ba170489 not found: ID does not exist" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.999401 4700 scope.go:117] "RemoveContainer" containerID="5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9" Oct 07 12:15:39 crc kubenswrapper[4700]: E1007 12:15:39.999738 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9\": container with ID starting with 5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9 not found: ID does not exist" containerID="5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.999798 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9"} err="failed to get container status \"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9\": rpc error: code = NotFound desc = could not find container \"5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9\": container with ID starting with 5d7564d0387cbfcc1a09a19e44f90fecda9b25f45f546cb2c9c2c9e3b00c13f9 not found: ID does not exist" Oct 07 12:15:39 crc kubenswrapper[4700]: I1007 12:15:39.999832 4700 scope.go:117] "RemoveContainer" containerID="68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec" Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.000170 4700 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec\": container with ID starting with 68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec not found: ID does not exist" containerID="68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.000209 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec"} err="failed to get container status \"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec\": rpc error: code = NotFound desc = could not find container \"68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec\": container with ID starting with 68d410ba093c47f69da2a05ed03496e2f5cbe94daf2f506f20c5697bf54813ec not found: ID does not exist" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.000237 4700 scope.go:117] "RemoveContainer" containerID="3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77" Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.000489 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77\": container with ID starting with 3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77 not found: ID does not exist" containerID="3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.000511 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77"} err="failed to get container status \"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77\": rpc error: code = NotFound desc = could not find container 
\"3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77\": container with ID starting with 3d5925b3c4399abe1405865eba72559ada3848c20da7f72331882fcd3f746e77 not found: ID does not exist" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.044862 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.053710 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.068382 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.069196 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-evaluator" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.069266 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-evaluator" Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.069295 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-api" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.069357 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-api" Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.069428 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-listener" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.069445 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-listener" Oct 07 12:15:40 crc kubenswrapper[4700]: E1007 12:15:40.069502 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-notifier" Oct 
07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.069519 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-notifier" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.070168 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-notifier" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.070219 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-api" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.070243 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-evaluator" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.070267 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" containerName="aodh-listener" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.073373 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.075778 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.075821 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.076020 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.076064 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.076145 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.077936 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.183518 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.183579 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.183813 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.184124 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2wb\" (UniqueName: \"kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.184354 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.184442 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286447 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286535 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286585 4700 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286636 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286684 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.286754 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2wb\" (UniqueName: \"kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.291070 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.292536 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 
12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.293064 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.294734 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.295091 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.309594 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2wb\" (UniqueName: \"kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb\") pod \"aodh-0\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.410167 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:15:40 crc kubenswrapper[4700]: I1007 12:15:40.901935 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:15:41 crc kubenswrapper[4700]: I1007 12:15:41.742977 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerStarted","Data":"87eddb294453950166f6acf1c7f259a303ebc46deb062ed4807ab505003d201e"} Oct 07 12:15:41 crc kubenswrapper[4700]: I1007 12:15:41.978753 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ee9baf-609a-49cc-b7cb-4dac97633cf7" path="/var/lib/kubelet/pods/d8ee9baf-609a-49cc-b7cb-4dac97633cf7/volumes" Oct 07 12:15:44 crc kubenswrapper[4700]: I1007 12:15:44.800619 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerStarted","Data":"c8ebf6c572a1d0045b4d3567acfcd8b1ef596b70299affbd185a96d51b1a5d15"} Oct 07 12:15:45 crc kubenswrapper[4700]: I1007 12:15:45.838564 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerStarted","Data":"71c567ffb2171f93e772220f6f17e84784a2de84f7eef5d68f81d1d7b7e6f75e"} Oct 07 12:15:45 crc kubenswrapper[4700]: I1007 12:15:45.842591 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" event={"ID":"398ce44d-03fb-4ee9-ac61-2ca3fd52074e","Type":"ContainerStarted","Data":"4349bf33969cffb6025f83583f76ed7bdd91a47e08ee7a76b09b557616aeaac1"} Oct 07 12:15:45 crc kubenswrapper[4700]: I1007 12:15:45.882098 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8xvhj" podStartSLOduration=2.482738934 podStartE2EDuration="38.882076895s" podCreationTimestamp="2025-10-07 12:15:07 +0000 UTC" 
firstStartedPulling="2025-10-07 12:15:08.717772714 +0000 UTC m=+3275.514171703" lastFinishedPulling="2025-10-07 12:15:45.117110665 +0000 UTC m=+3311.913509664" observedRunningTime="2025-10-07 12:15:45.87612736 +0000 UTC m=+3312.672526349" watchObservedRunningTime="2025-10-07 12:15:45.882076895 +0000 UTC m=+3312.678475884" Oct 07 12:15:46 crc kubenswrapper[4700]: I1007 12:15:46.854990 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" event={"ID":"0319fc60-cd28-49d8-af70-3a2306fe89fd","Type":"ContainerStarted","Data":"c0577dc1cd27b4ecfe2c6bcefc72e1f41d22bb8c00e4b087d8f9007dc763716d"} Oct 07 12:15:46 crc kubenswrapper[4700]: I1007 12:15:46.857079 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" event={"ID":"7d619134-8aae-4140-b5ca-33deeac1a66c","Type":"ContainerStarted","Data":"1c25fdfffcb3cccf2b0268a1b86ef5e5d72123202f42b54839c7885f9cb5fcb5"} Oct 07 12:15:46 crc kubenswrapper[4700]: I1007 12:15:46.859372 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerStarted","Data":"982c82799820fa195cf5f9e3ac7199473f9f9a69a83fb56d5ee791f9aa775fac"} Oct 07 12:15:46 crc kubenswrapper[4700]: I1007 12:15:46.879508 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz" podStartSLOduration=-9223371996.975286 podStartE2EDuration="39.879489886s" podCreationTimestamp="2025-10-07 12:15:07 +0000 UTC" firstStartedPulling="2025-10-07 12:15:08.960139294 +0000 UTC m=+3275.756538283" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:15:46.868695624 +0000 UTC m=+3313.665094633" watchObservedRunningTime="2025-10-07 12:15:46.879489886 +0000 UTC m=+3313.675888895" Oct 07 12:15:46 crc 
kubenswrapper[4700]: I1007 12:15:46.910159 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d" podStartSLOduration=3.09578016 podStartE2EDuration="39.910140838s" podCreationTimestamp="2025-10-07 12:15:07 +0000 UTC" firstStartedPulling="2025-10-07 12:15:08.981981575 +0000 UTC m=+3275.778380564" lastFinishedPulling="2025-10-07 12:15:45.796342253 +0000 UTC m=+3312.592741242" observedRunningTime="2025-10-07 12:15:46.89453013 +0000 UTC m=+3313.690929139" watchObservedRunningTime="2025-10-07 12:15:46.910140838 +0000 UTC m=+3313.706539827" Oct 07 12:15:48 crc kubenswrapper[4700]: I1007 12:15:48.887525 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerStarted","Data":"c370b1badeb86ec56c55a7661f360de1c3fda515948096088ee374a1e12ff751"} Oct 07 12:15:48 crc kubenswrapper[4700]: I1007 12:15:48.891645 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" event={"ID":"45384a63-61c2-4d8b-906a-e7545addde11","Type":"ContainerStarted","Data":"69fab556f3a371518a777a091e0fb1a05ef99479bc1bac42c5e491235bf79174"} Oct 07 12:15:48 crc kubenswrapper[4700]: I1007 12:15:48.891819 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:48 crc kubenswrapper[4700]: I1007 12:15:48.907763 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.8739620289999999 podStartE2EDuration="8.907748581s" podCreationTimestamp="2025-10-07 12:15:40 +0000 UTC" firstStartedPulling="2025-10-07 12:15:40.908199927 +0000 UTC m=+3307.704598926" lastFinishedPulling="2025-10-07 12:15:47.941986479 +0000 UTC m=+3314.738385478" observedRunningTime="2025-10-07 12:15:48.90657236 +0000 UTC m=+3315.702971349" 
watchObservedRunningTime="2025-10-07 12:15:48.907748581 +0000 UTC m=+3315.704147570" Oct 07 12:15:48 crc kubenswrapper[4700]: I1007 12:15:48.929713 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" podStartSLOduration=2.275248085 podStartE2EDuration="40.929693775s" podCreationTimestamp="2025-10-07 12:15:08 +0000 UTC" firstStartedPulling="2025-10-07 12:15:09.288676988 +0000 UTC m=+3276.085075977" lastFinishedPulling="2025-10-07 12:15:47.943122658 +0000 UTC m=+3314.739521667" observedRunningTime="2025-10-07 12:15:48.92909446 +0000 UTC m=+3315.725493449" watchObservedRunningTime="2025-10-07 12:15:48.929693775 +0000 UTC m=+3315.726092764" Oct 07 12:15:53 crc kubenswrapper[4700]: I1007 12:15:53.951380 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" event={"ID":"c82137a0-2748-492d-bd33-39b03e9c8139","Type":"ContainerStarted","Data":"2cc46864bed81ab08e2c9df6b0c457d35dfe51225e615d3c0ed34b4e031c206c"} Oct 07 12:15:53 crc kubenswrapper[4700]: I1007 12:15:53.953103 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:53 crc kubenswrapper[4700]: I1007 12:15:53.980445 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" podStartSLOduration=1.620869697 podStartE2EDuration="45.980423133s" podCreationTimestamp="2025-10-07 12:15:08 +0000 UTC" firstStartedPulling="2025-10-07 12:15:09.145490882 +0000 UTC m=+3275.941889871" lastFinishedPulling="2025-10-07 12:15:53.505044308 +0000 UTC m=+3320.301443307" observedRunningTime="2025-10-07 12:15:53.973449611 +0000 UTC m=+3320.769848600" watchObservedRunningTime="2025-10-07 12:15:53.980423133 +0000 UTC m=+3320.776822132" Oct 07 12:15:53 crc kubenswrapper[4700]: I1007 12:15:53.986525 4700 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-w5vqc" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.623194 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.626558 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.628527 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.628599 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.629489 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.637512 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-gq8d7" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.651202 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799476 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85v8\" (UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-kube-api-access-m85v8\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799543 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799578 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799596 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799616 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.799720 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901086 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901198 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85v8\" (UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-kube-api-access-m85v8\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901252 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901296 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901333 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.901359 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.902044 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.906870 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.907161 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de281a78-c284-4c5e-8312-6661e2543668-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.907955 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.908503 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de281a78-c284-4c5e-8312-6661e2543668-config-out\") pod 
\"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.929161 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85v8\" (UniqueName: \"kubernetes.io/projected/de281a78-c284-4c5e-8312-6661e2543668-kube-api-access-m85v8\") pod \"alertmanager-metric-storage-0\" (UID: \"de281a78-c284-4c5e-8312-6661e2543668\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:56 crc kubenswrapper[4700]: I1007 12:15:56.950202 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.182283 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.185759 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.188707 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.188959 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.189076 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cgn6d" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.189108 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.189788 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 
07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.189900 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.205278 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310018 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310084 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmls\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310142 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310165 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " 
pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310191 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310217 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310261 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.310283 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412159 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod 
\"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412232 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmls\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412289 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412337 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412364 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.412387 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.413514 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.413564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.414379 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.417739 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.419612 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.419888 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.420247 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.420788 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.431110 4700 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.431149 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/902ed96a99d52744c097638c6d4b432f7c41d1a9eceb55821960a227d9fe2797/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.438697 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmls\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.446284 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.449601 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.517495 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:57 crc kubenswrapper[4700]: I1007 12:15:57.530604 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:15:58 crc kubenswrapper[4700]: I1007 12:15:58.026966 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"de281a78-c284-4c5e-8312-6661e2543668","Type":"ContainerStarted","Data":"86a8f20491556613b8afbc347da4adc604bec3166aa2c9e11b6977671e03497b"} Oct 07 12:15:58 crc kubenswrapper[4700]: I1007 12:15:58.087734 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:15:58 crc kubenswrapper[4700]: W1007 12:15:58.105134 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda460e42e_4de8_416d_8ffd_fbc501c5a047.slice/crio-b452285874aa5a1c46ca9fd55b52d52e49d113bfc57d29030156d612ad6c5dc9 WatchSource:0}: Error finding container b452285874aa5a1c46ca9fd55b52d52e49d113bfc57d29030156d612ad6c5dc9: Status 404 returned error can't find the container with id b452285874aa5a1c46ca9fd55b52d52e49d113bfc57d29030156d612ad6c5dc9 Oct 07 12:15:58 crc kubenswrapper[4700]: I1007 12:15:58.766766 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-fnrbp" Oct 07 12:15:59 crc kubenswrapper[4700]: I1007 12:15:59.036408 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerStarted","Data":"b452285874aa5a1c46ca9fd55b52d52e49d113bfc57d29030156d612ad6c5dc9"} Oct 07 12:16:04 crc kubenswrapper[4700]: I1007 12:16:04.090929 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"de281a78-c284-4c5e-8312-6661e2543668","Type":"ContainerStarted","Data":"ee1e5cd6b0dc3e5c217d411bef05e641d96c539c115cded1fc5cf52874f6ea8c"} Oct 07 12:16:04 crc kubenswrapper[4700]: I1007 12:16:04.092290 4700 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerStarted","Data":"74d637a47f56537d37c339755a1f8596ac6ae05a2b67a832691725e6e5ae3b13"} Oct 07 12:16:12 crc kubenswrapper[4700]: I1007 12:16:12.207529 4700 generic.go:334] "Generic (PLEG): container finished" podID="de281a78-c284-4c5e-8312-6661e2543668" containerID="ee1e5cd6b0dc3e5c217d411bef05e641d96c539c115cded1fc5cf52874f6ea8c" exitCode=0 Oct 07 12:16:12 crc kubenswrapper[4700]: I1007 12:16:12.207613 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"de281a78-c284-4c5e-8312-6661e2543668","Type":"ContainerDied","Data":"ee1e5cd6b0dc3e5c217d411bef05e641d96c539c115cded1fc5cf52874f6ea8c"} Oct 07 12:16:13 crc kubenswrapper[4700]: I1007 12:16:13.225799 4700 generic.go:334] "Generic (PLEG): container finished" podID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerID="74d637a47f56537d37c339755a1f8596ac6ae05a2b67a832691725e6e5ae3b13" exitCode=0 Oct 07 12:16:13 crc kubenswrapper[4700]: I1007 12:16:13.226382 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerDied","Data":"74d637a47f56537d37c339755a1f8596ac6ae05a2b67a832691725e6e5ae3b13"} Oct 07 12:16:15 crc kubenswrapper[4700]: I1007 12:16:15.334232 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:16:15 crc kubenswrapper[4700]: I1007 12:16:15.334598 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:16:21 crc kubenswrapper[4700]: I1007 12:16:21.331929 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"de281a78-c284-4c5e-8312-6661e2543668","Type":"ContainerStarted","Data":"0a330d3abae02dbecc8314c2164fbcae51fce11e3e4ea0ced33e53046616cbec"} Oct 07 12:16:25 crc kubenswrapper[4700]: I1007 12:16:25.392813 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"de281a78-c284-4c5e-8312-6661e2543668","Type":"ContainerStarted","Data":"465b2d9734f4b695a0d71599894eb307279988508fb00dadaa35003adc936077"} Oct 07 12:16:26 crc kubenswrapper[4700]: I1007 12:16:26.403639 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 07 12:16:26 crc kubenswrapper[4700]: I1007 12:16:26.406747 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 07 12:16:26 crc kubenswrapper[4700]: I1007 12:16:26.433590 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.773448392 podStartE2EDuration="30.43356515s" podCreationTimestamp="2025-10-07 12:15:56 +0000 UTC" firstStartedPulling="2025-10-07 12:15:57.449325454 +0000 UTC m=+3324.245724443" lastFinishedPulling="2025-10-07 12:16:20.109442202 +0000 UTC m=+3346.905841201" observedRunningTime="2025-10-07 12:16:26.422180572 +0000 UTC m=+3353.218579581" watchObservedRunningTime="2025-10-07 12:16:26.43356515 +0000 UTC m=+3353.229964169" Oct 07 12:16:32 crc kubenswrapper[4700]: I1007 12:16:32.473243 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerStarted","Data":"f832e5451396ef0310d90bf120267c0a7f6a8d5d1fe2e07ffca2254f9b87fe85"} Oct 07 12:16:36 crc kubenswrapper[4700]: I1007 12:16:36.526724 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerStarted","Data":"bd5902b9845888a66e9214ee52b0ec370dd107768716b6e2db56fbc3b144a79d"} Oct 07 12:16:43 crc kubenswrapper[4700]: I1007 12:16:43.617008 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerStarted","Data":"2520fefbc940a97e347b5ef55ebb3cf845d4ccfab7c3fd5ee2809f1343ab13fe"} Oct 07 12:16:43 crc kubenswrapper[4700]: I1007 12:16:43.669829 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.594890892 podStartE2EDuration="47.669801879s" podCreationTimestamp="2025-10-07 12:15:56 +0000 UTC" firstStartedPulling="2025-10-07 12:15:58.107548153 +0000 UTC m=+3324.903947142" lastFinishedPulling="2025-10-07 12:16:43.1824591 +0000 UTC m=+3369.978858129" observedRunningTime="2025-10-07 12:16:43.648687483 +0000 UTC m=+3370.445086502" watchObservedRunningTime="2025-10-07 12:16:43.669801879 +0000 UTC m=+3370.466200898" Oct 07 12:16:45 crc kubenswrapper[4700]: I1007 12:16:45.334027 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:16:45 crc kubenswrapper[4700]: I1007 12:16:45.334106 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:16:47 crc kubenswrapper[4700]: I1007 12:16:47.531770 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.531440 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.541881 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.718137 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.723458 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.739883 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.784703 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.867611 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.868084 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-frnp7\" (UniqueName: \"kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.868111 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.969641 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.969723 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frnp7\" (UniqueName: \"kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.969750 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.970114 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.970242 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:57 crc kubenswrapper[4700]: I1007 12:16:57.994611 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frnp7\" (UniqueName: \"kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7\") pod \"certified-operators-82kbr\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:58 crc kubenswrapper[4700]: I1007 12:16:58.051142 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.186358 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.186917 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" containerName="openstackclient" containerID="cri-o://0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c" gracePeriod=2 Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.207572 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.228712 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 12:16:59 crc kubenswrapper[4700]: E1007 12:16:59.229103 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" containerName="openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.229116 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" containerName="openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.237659 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" containerName="openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.238382 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.240537 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.241369 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" podUID="7474ed66-6936-4cd0-b7ca-0182eaeec767" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.305099 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.305511 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.305707 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config-secret\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.305750 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtpws\" (UniqueName: \"kubernetes.io/projected/7474ed66-6936-4cd0-b7ca-0182eaeec767-kube-api-access-wtpws\") pod \"openstackclient\" (UID: 
\"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.407765 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.407878 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config-secret\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.407920 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtpws\" (UniqueName: \"kubernetes.io/projected/7474ed66-6936-4cd0-b7ca-0182eaeec767-kube-api-access-wtpws\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.407968 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.408720 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.413552 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-openstack-config-secret\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.414265 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ed66-6936-4cd0-b7ca-0182eaeec767-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.427095 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtpws\" (UniqueName: \"kubernetes.io/projected/7474ed66-6936-4cd0-b7ca-0182eaeec767-kube-api-access-wtpws\") pod \"openstackclient\" (UID: \"7474ed66-6936-4cd0-b7ca-0182eaeec767\") " pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.545759 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.546069 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-api" containerID="cri-o://c8ebf6c572a1d0045b4d3567acfcd8b1ef596b70299affbd185a96d51b1a5d15" gracePeriod=30 Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.546130 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-listener" containerID="cri-o://c370b1badeb86ec56c55a7661f360de1c3fda515948096088ee374a1e12ff751" gracePeriod=30 Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.546179 4700 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-notifier" containerID="cri-o://982c82799820fa195cf5f9e3ac7199473f9f9a69a83fb56d5ee791f9aa775fac" gracePeriod=30 Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.546246 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-evaluator" containerID="cri-o://71c567ffb2171f93e772220f6f17e84784a2de84f7eef5d68f81d1d7b7e6f75e" gracePeriod=30 Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.582331 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.603536 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:16:59 crc kubenswrapper[4700]: I1007 12:16:59.873552 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerStarted","Data":"6d1e6300951ad5f8bf7f13200709b6179655bd1ae273cacde7dc1742bdd75c74"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.285226 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:17:00 crc kubenswrapper[4700]: W1007 12:17:00.293434 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7474ed66_6936_4cd0_b7ca_0182eaeec767.slice/crio-e83f30bc19bf413fe4f4fa5d6336a1228d48b3d379514d36381043ab40be0d0b WatchSource:0}: Error finding container e83f30bc19bf413fe4f4fa5d6336a1228d48b3d379514d36381043ab40be0d0b: Status 404 returned error can't find the container with id e83f30bc19bf413fe4f4fa5d6336a1228d48b3d379514d36381043ab40be0d0b Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.452691 4700 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.453374 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="thanos-sidecar" containerID="cri-o://2520fefbc940a97e347b5ef55ebb3cf845d4ccfab7c3fd5ee2809f1343ab13fe" gracePeriod=600 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.453405 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="config-reloader" containerID="cri-o://bd5902b9845888a66e9214ee52b0ec370dd107768716b6e2db56fbc3b144a79d" gracePeriod=600 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.455640 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="prometheus" containerID="cri-o://f832e5451396ef0310d90bf120267c0a7f6a8d5d1fe2e07ffca2254f9b87fe85" gracePeriod=600 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.887486 4700 generic.go:334] "Generic (PLEG): container finished" podID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerID="cd984c3627dfe98e33edef2d18a2f82fce4b16c2cd94b98a124775f8dadabf04" exitCode=0 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.887555 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerDied","Data":"cd984c3627dfe98e33edef2d18a2f82fce4b16c2cd94b98a124775f8dadabf04"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.894756 4700 generic.go:334] "Generic (PLEG): container finished" podID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerID="71c567ffb2171f93e772220f6f17e84784a2de84f7eef5d68f81d1d7b7e6f75e" exitCode=0 Oct 07 12:17:00 
crc kubenswrapper[4700]: I1007 12:17:00.894785 4700 generic.go:334] "Generic (PLEG): container finished" podID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerID="c8ebf6c572a1d0045b4d3567acfcd8b1ef596b70299affbd185a96d51b1a5d15" exitCode=0 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.894829 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerDied","Data":"71c567ffb2171f93e772220f6f17e84784a2de84f7eef5d68f81d1d7b7e6f75e"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.894857 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerDied","Data":"c8ebf6c572a1d0045b4d3567acfcd8b1ef596b70299affbd185a96d51b1a5d15"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.897458 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7474ed66-6936-4cd0-b7ca-0182eaeec767","Type":"ContainerStarted","Data":"04dd03e23faedd5941331ae2968c70f58b1f4ae888aedb6c2353dcc34b3a1178"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.897502 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7474ed66-6936-4cd0-b7ca-0182eaeec767","Type":"ContainerStarted","Data":"e83f30bc19bf413fe4f4fa5d6336a1228d48b3d379514d36381043ab40be0d0b"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.902034 4700 generic.go:334] "Generic (PLEG): container finished" podID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerID="2520fefbc940a97e347b5ef55ebb3cf845d4ccfab7c3fd5ee2809f1343ab13fe" exitCode=0 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.902070 4700 generic.go:334] "Generic (PLEG): container finished" podID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerID="f832e5451396ef0310d90bf120267c0a7f6a8d5d1fe2e07ffca2254f9b87fe85" exitCode=0 Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.902073 
4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerDied","Data":"2520fefbc940a97e347b5ef55ebb3cf845d4ccfab7c3fd5ee2809f1343ab13fe"} Oct 07 12:17:00 crc kubenswrapper[4700]: I1007 12:17:00.902158 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerDied","Data":"f832e5451396ef0310d90bf120267c0a7f6a8d5d1fe2e07ffca2254f9b87fe85"} Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.657514 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.762826 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wvrn\" (UniqueName: \"kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn\") pod \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.762882 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret\") pod \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.763126 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle\") pod \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.763161 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config\") pod \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\" (UID: \"d6005c71-9a39-4ab6-876a-99dd2f60c9ae\") " Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.784732 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn" (OuterVolumeSpecName: "kube-api-access-5wvrn") pod "d6005c71-9a39-4ab6-876a-99dd2f60c9ae" (UID: "d6005c71-9a39-4ab6-876a-99dd2f60c9ae"). InnerVolumeSpecName "kube-api-access-5wvrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.791543 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d6005c71-9a39-4ab6-876a-99dd2f60c9ae" (UID: "d6005c71-9a39-4ab6-876a-99dd2f60c9ae"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.821060 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6005c71-9a39-4ab6-876a-99dd2f60c9ae" (UID: "d6005c71-9a39-4ab6-876a-99dd2f60c9ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.845362 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d6005c71-9a39-4ab6-876a-99dd2f60c9ae" (UID: "d6005c71-9a39-4ab6-876a-99dd2f60c9ae"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.865461 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wvrn\" (UniqueName: \"kubernetes.io/projected/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-kube-api-access-5wvrn\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.865486 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.865495 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.865504 4700 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6005c71-9a39-4ab6-876a-99dd2f60c9ae-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.912475 4700 generic.go:334] "Generic (PLEG): container finished" podID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerID="bd5902b9845888a66e9214ee52b0ec370dd107768716b6e2db56fbc3b144a79d" exitCode=0 Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.912556 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerDied","Data":"bd5902b9845888a66e9214ee52b0ec370dd107768716b6e2db56fbc3b144a79d"} Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.915347 4700 generic.go:334] "Generic (PLEG): container finished" podID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerID="c370b1badeb86ec56c55a7661f360de1c3fda515948096088ee374a1e12ff751" exitCode=0 Oct 07 
12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.915468 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerDied","Data":"c370b1badeb86ec56c55a7661f360de1c3fda515948096088ee374a1e12ff751"} Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.917399 4700 generic.go:334] "Generic (PLEG): container finished" podID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" containerID="0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c" exitCode=137 Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.917602 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.917623 4700 scope.go:117] "RemoveContainer" containerID="0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.927239 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.935036 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.935018443 podStartE2EDuration="2.935018443s" podCreationTimestamp="2025-10-07 12:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:17:01.929771797 +0000 UTC m=+3388.726170786" watchObservedRunningTime="2025-10-07 12:17:01.935018443 +0000 UTC m=+3388.731417432" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.950970 4700 scope.go:117] "RemoveContainer" containerID="0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c" Oct 07 12:17:01 crc kubenswrapper[4700]: E1007 12:17:01.951536 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c\": container with ID starting with 0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c not found: ID does not exist" containerID="0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.951649 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c"} err="failed to get container status \"0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c\": rpc error: code = NotFound desc = could not find container \"0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c\": container with ID starting with 0a05451550bcedc55ea69b11889bc6287e0978e9ed60e5232005deae9188df7c not found: ID does not exist" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.969874 4700 status_manager.go:861] "Pod was deleted and then recreated, skipping 
status update" pod="openstack/openstackclient" oldPodUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" podUID="7474ed66-6936-4cd0-b7ca-0182eaeec767" Oct 07 12:17:01 crc kubenswrapper[4700]: I1007 12:17:01.975105 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6005c71-9a39-4ab6-876a-99dd2f60c9ae" path="/var/lib/kubelet/pods/d6005c71-9a39-4ab6-876a-99dd2f60c9ae/volumes" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073168 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073253 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073394 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pmls\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073509 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073551 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073591 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.073638 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.074384 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.074507 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"a460e42e-4de8-416d-8ffd-fbc501c5a047\" (UID: \"a460e42e-4de8-416d-8ffd-fbc501c5a047\") " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.076052 4700 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a460e42e-4de8-416d-8ffd-fbc501c5a047-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.076769 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.078345 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls" (OuterVolumeSpecName: "kube-api-access-6pmls") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "kube-api-access-6pmls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.079209 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out" (OuterVolumeSpecName: "config-out") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.088010 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config" (OuterVolumeSpecName: "config") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.088972 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.111448 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config" (OuterVolumeSpecName: "web-config") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.125381 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a460e42e-4de8-416d-8ffd-fbc501c5a047" (UID: "a460e42e-4de8-416d-8ffd-fbc501c5a047"). InnerVolumeSpecName "pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178410 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") on node \"crc\" " Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178440 4700 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a460e42e-4de8-416d-8ffd-fbc501c5a047-config-out\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178455 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178465 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pmls\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-kube-api-access-6pmls\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178474 4700 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a460e42e-4de8-416d-8ffd-fbc501c5a047-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178483 4700 
reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.178494 4700 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a460e42e-4de8-416d-8ffd-fbc501c5a047-web-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.206221 4700 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.206413 4700 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f") on node "crc" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.280768 4700 reconciler_common.go:293] "Volume detached for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.937523 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a460e42e-4de8-416d-8ffd-fbc501c5a047","Type":"ContainerDied","Data":"b452285874aa5a1c46ca9fd55b52d52e49d113bfc57d29030156d612ad6c5dc9"} Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.937656 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.937906 4700 scope.go:117] "RemoveContainer" containerID="2520fefbc940a97e347b5ef55ebb3cf845d4ccfab7c3fd5ee2809f1343ab13fe" Oct 07 12:17:02 crc kubenswrapper[4700]: I1007 12:17:02.996572 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.010985 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.031574 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:03 crc kubenswrapper[4700]: E1007 12:17:03.031995 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="prometheus" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032011 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="prometheus" Oct 07 12:17:03 crc kubenswrapper[4700]: E1007 12:17:03.032027 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="init-config-reloader" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032034 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="init-config-reloader" Oct 07 12:17:03 crc kubenswrapper[4700]: E1007 12:17:03.032052 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="config-reloader" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032057 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="config-reloader" Oct 07 12:17:03 crc kubenswrapper[4700]: E1007 12:17:03.032068 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="thanos-sidecar" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032074 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="thanos-sidecar" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032246 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="thanos-sidecar" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032261 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="config-reloader" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.032290 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" containerName="prometheus" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.034005 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.039137 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cgn6d" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.039384 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.039504 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.039610 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.039784 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.040632 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.049102 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.072504 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.099963 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100030 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100136 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100197 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100277 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnk6r\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100355 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100420 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.100486 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.103357 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.103644 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc 
kubenswrapper[4700]: I1007 12:17:03.103735 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.113015 4700 scope.go:117] "RemoveContainer" containerID="bd5902b9845888a66e9214ee52b0ec370dd107768716b6e2db56fbc3b144a79d" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.205828 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.205896 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.205925 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.205959 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.205999 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnk6r\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206052 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206098 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206143 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 
12:17:03.206190 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206249 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206272 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.206940 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.210468 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.211914 4700 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.215258 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.216562 4700 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.216584 4700 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/902ed96a99d52744c097638c6d4b432f7c41d1a9eceb55821960a227d9fe2797/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.216925 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.218286 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.218348 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.224296 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.224473 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.225895 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnk6r\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.281465 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"prometheus-metric-storage-0\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.359782 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.613693 4700 scope.go:117] "RemoveContainer" containerID="f832e5451396ef0310d90bf120267c0a7f6a8d5d1fe2e07ffca2254f9b87fe85" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.959637 4700 generic.go:334] "Generic (PLEG): container finished" podID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerID="982c82799820fa195cf5f9e3ac7199473f9f9a69a83fb56d5ee791f9aa775fac" exitCode=0 Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.979783 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a460e42e-4de8-416d-8ffd-fbc501c5a047" path="/var/lib/kubelet/pods/a460e42e-4de8-416d-8ffd-fbc501c5a047/volumes" Oct 07 12:17:03 crc kubenswrapper[4700]: I1007 12:17:03.981588 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerDied","Data":"982c82799820fa195cf5f9e3ac7199473f9f9a69a83fb56d5ee791f9aa775fac"} Oct 07 12:17:04 crc kubenswrapper[4700]: I1007 12:17:04.498137 4700 scope.go:117] "RemoveContainer" containerID="74d637a47f56537d37c339755a1f8596ac6ae05a2b67a832691725e6e5ae3b13" Oct 07 12:17:04 crc kubenswrapper[4700]: I1007 12:17:04.975398 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:17:04 crc kubenswrapper[4700]: I1007 12:17:04.977634 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa","Type":"ContainerDied","Data":"87eddb294453950166f6acf1c7f259a303ebc46deb062ed4807ab505003d201e"} Oct 07 12:17:04 crc kubenswrapper[4700]: I1007 12:17:04.977721 4700 scope.go:117] "RemoveContainer" containerID="c370b1badeb86ec56c55a7661f360de1c3fda515948096088ee374a1e12ff751" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.052674 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.052820 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.052849 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg2wb\" (UniqueName: \"kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.052914 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: 
I1007 12:17:05.053049 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.053103 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data\") pod \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\" (UID: \"c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa\") " Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.058809 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb" (OuterVolumeSpecName: "kube-api-access-vg2wb") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "kube-api-access-vg2wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.059733 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts" (OuterVolumeSpecName: "scripts") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.114826 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.116980 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.155297 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.155629 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg2wb\" (UniqueName: \"kubernetes.io/projected/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-kube-api-access-vg2wb\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.155639 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.155648 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.155743 4700 scope.go:117] "RemoveContainer" containerID="982c82799820fa195cf5f9e3ac7199473f9f9a69a83fb56d5ee791f9aa775fac" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.156324 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.188435 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data" (OuterVolumeSpecName: "config-data") pod "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" (UID: "c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.244285 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.257965 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.257990 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.524933 4700 scope.go:117] "RemoveContainer" containerID="71c567ffb2171f93e772220f6f17e84784a2de84f7eef5d68f81d1d7b7e6f75e" Oct 07 12:17:05 crc kubenswrapper[4700]: W1007 12:17:05.530596 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea57797_7f86_46b3_ab7e_d35d95e1c4a4.slice/crio-37791f44e9a471223a4c304747490276a6fb5245a486b5af18c32bd4fffcaf2d WatchSource:0}: Error finding container 37791f44e9a471223a4c304747490276a6fb5245a486b5af18c32bd4fffcaf2d: Status 404 returned error can't find the container with id 
37791f44e9a471223a4c304747490276a6fb5245a486b5af18c32bd4fffcaf2d Oct 07 12:17:05 crc kubenswrapper[4700]: I1007 12:17:05.584637 4700 scope.go:117] "RemoveContainer" containerID="c8ebf6c572a1d0045b4d3567acfcd8b1ef596b70299affbd185a96d51b1a5d15" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.019776 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.036406 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerStarted","Data":"37791f44e9a471223a4c304747490276a6fb5245a486b5af18c32bd4fffcaf2d"} Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.063432 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.096784 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.127467 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 12:17:06 crc kubenswrapper[4700]: E1007 12:17:06.128521 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-api" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.128537 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-api" Oct 07 12:17:06 crc kubenswrapper[4700]: E1007 12:17:06.128565 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-evaluator" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.128573 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-evaluator" Oct 07 12:17:06 crc kubenswrapper[4700]: E1007 12:17:06.128586 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-notifier" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.128593 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-notifier" Oct 07 12:17:06 crc kubenswrapper[4700]: E1007 12:17:06.128630 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-listener" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.128636 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-listener" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.129048 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-listener" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.129078 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-api" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.129087 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-evaluator" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.129110 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" containerName="aodh-notifier" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.133478 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.135765 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.139891 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.140476 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.143162 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.143401 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.143441 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180667 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180709 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180775 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djt7d\" (UniqueName: 
\"kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180794 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180821 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.180913 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.286686 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.286751 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 
12:17:06.286772 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.286828 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djt7d\" (UniqueName: \"kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.286845 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.286872 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.302438 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.302541 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " 
pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.303118 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.303268 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.303345 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.306823 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djt7d\" (UniqueName: \"kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d\") pod \"aodh-0\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.461486 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:17:06 crc kubenswrapper[4700]: I1007 12:17:06.958650 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:17:06 crc kubenswrapper[4700]: W1007 12:17:06.966804 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90228e7e_2fff_43c5_8db3_6b803d39682c.slice/crio-9504d46207a78f19b5e6bfac820998541062fc83a72092937168010b0670e452 WatchSource:0}: Error finding container 9504d46207a78f19b5e6bfac820998541062fc83a72092937168010b0670e452: Status 404 returned error can't find the container with id 9504d46207a78f19b5e6bfac820998541062fc83a72092937168010b0670e452 Oct 07 12:17:07 crc kubenswrapper[4700]: I1007 12:17:07.051282 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerStarted","Data":"973675de124f56170a81d35860689254476d16c8a3ced834fd2f082abf0c501a"} Oct 07 12:17:07 crc kubenswrapper[4700]: I1007 12:17:07.052401 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerStarted","Data":"9504d46207a78f19b5e6bfac820998541062fc83a72092937168010b0670e452"} Oct 07 12:17:07 crc kubenswrapper[4700]: I1007 12:17:07.979939 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa" path="/var/lib/kubelet/pods/c2cb6c0e-98a9-4c06-b5f8-ec45c13faffa/volumes" Oct 07 12:17:08 crc kubenswrapper[4700]: I1007 12:17:08.074123 4700 generic.go:334] "Generic (PLEG): container finished" podID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerID="973675de124f56170a81d35860689254476d16c8a3ced834fd2f082abf0c501a" exitCode=0 Oct 07 12:17:08 crc kubenswrapper[4700]: I1007 12:17:08.074170 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerDied","Data":"973675de124f56170a81d35860689254476d16c8a3ced834fd2f082abf0c501a"} Oct 07 12:17:10 crc kubenswrapper[4700]: I1007 12:17:10.104902 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerStarted","Data":"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d"} Oct 07 12:17:10 crc kubenswrapper[4700]: I1007 12:17:10.107624 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerStarted","Data":"eba12554c6e1811a8e6ad023e5a462156b52507774eac34a88821da63a7c4a77"} Oct 07 12:17:11 crc kubenswrapper[4700]: I1007 12:17:11.122119 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerStarted","Data":"247892545a67115b892ea82262bb747270c9e4b2528a457d936836da32273998"} Oct 07 12:17:11 crc kubenswrapper[4700]: I1007 12:17:11.127405 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerStarted","Data":"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6"} Oct 07 12:17:11 crc kubenswrapper[4700]: I1007 12:17:11.151117 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-82kbr" podStartSLOduration=4.603334194 podStartE2EDuration="14.1510922s" podCreationTimestamp="2025-10-07 12:16:57 +0000 UTC" firstStartedPulling="2025-10-07 12:17:00.890077005 +0000 UTC m=+3387.686475994" lastFinishedPulling="2025-10-07 12:17:10.437835001 +0000 UTC m=+3397.234234000" observedRunningTime="2025-10-07 12:17:11.142067226 +0000 UTC m=+3397.938466225" 
watchObservedRunningTime="2025-10-07 12:17:11.1510922 +0000 UTC m=+3397.947491189" Oct 07 12:17:14 crc kubenswrapper[4700]: I1007 12:17:14.167580 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerStarted","Data":"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195"} Oct 07 12:17:15 crc kubenswrapper[4700]: I1007 12:17:15.334378 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:17:15 crc kubenswrapper[4700]: I1007 12:17:15.335657 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:17:15 crc kubenswrapper[4700]: I1007 12:17:15.335720 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:17:15 crc kubenswrapper[4700]: I1007 12:17:15.336437 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:17:15 crc kubenswrapper[4700]: I1007 12:17:15.336494 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be" gracePeriod=600 Oct 07 12:17:16 crc kubenswrapper[4700]: I1007 12:17:16.192631 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be" exitCode=0 Oct 07 12:17:16 crc kubenswrapper[4700]: I1007 12:17:16.192976 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be"} Oct 07 12:17:16 crc kubenswrapper[4700]: I1007 12:17:16.193012 4700 scope.go:117] "RemoveContainer" containerID="c06b2df83958e39f4d5be0954930f188e3730ce13fd533f1ae848cc8c7038468" Oct 07 12:17:17 crc kubenswrapper[4700]: I1007 12:17:17.208575 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerStarted","Data":"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98"} Oct 07 12:17:17 crc kubenswrapper[4700]: I1007 12:17:17.221477 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b"} Oct 07 12:17:17 crc kubenswrapper[4700]: I1007 12:17:17.251864 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.046489559 podStartE2EDuration="11.251847311s" podCreationTimestamp="2025-10-07 12:17:06 +0000 UTC" firstStartedPulling="2025-10-07 12:17:06.969713348 +0000 UTC m=+3393.766112357" lastFinishedPulling="2025-10-07 12:17:16.17507112 
+0000 UTC m=+3402.971470109" observedRunningTime="2025-10-07 12:17:17.232368998 +0000 UTC m=+3404.028768077" watchObservedRunningTime="2025-10-07 12:17:17.251847311 +0000 UTC m=+3404.048246300" Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.052297 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.052704 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.122433 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.229852 4700 generic.go:334] "Generic (PLEG): container finished" podID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerID="eba12554c6e1811a8e6ad023e5a462156b52507774eac34a88821da63a7c4a77" exitCode=0 Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.229994 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerDied","Data":"eba12554c6e1811a8e6ad023e5a462156b52507774eac34a88821da63a7c4a77"} Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.295041 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:18 crc kubenswrapper[4700]: I1007 12:17:18.357673 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:17:19 crc kubenswrapper[4700]: I1007 12:17:19.241574 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerStarted","Data":"1985dc8b1dc230fe52cd0ac0802d7229a57df9a2a5ae35a15ca54e1ac04e4b36"} Oct 07 12:17:20 crc kubenswrapper[4700]: I1007 12:17:20.256367 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-82kbr" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="registry-server" containerID="cri-o://247892545a67115b892ea82262bb747270c9e4b2528a457d936836da32273998" gracePeriod=2 Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.284737 4700 generic.go:334] "Generic (PLEG): container finished" podID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerID="247892545a67115b892ea82262bb747270c9e4b2528a457d936836da32273998" exitCode=0 Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.284826 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerDied","Data":"247892545a67115b892ea82262bb747270c9e4b2528a457d936836da32273998"} Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.893506 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.993922 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content\") pod \"6968d3d1-c3bb-43df-a62f-17bf683ba455\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.994174 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frnp7\" (UniqueName: \"kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7\") pod \"6968d3d1-c3bb-43df-a62f-17bf683ba455\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.994222 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities\") pod \"6968d3d1-c3bb-43df-a62f-17bf683ba455\" (UID: \"6968d3d1-c3bb-43df-a62f-17bf683ba455\") " Oct 07 12:17:22 crc kubenswrapper[4700]: I1007 12:17:22.995013 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities" (OuterVolumeSpecName: "utilities") pod "6968d3d1-c3bb-43df-a62f-17bf683ba455" (UID: "6968d3d1-c3bb-43df-a62f-17bf683ba455"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.000716 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7" (OuterVolumeSpecName: "kube-api-access-frnp7") pod "6968d3d1-c3bb-43df-a62f-17bf683ba455" (UID: "6968d3d1-c3bb-43df-a62f-17bf683ba455"). InnerVolumeSpecName "kube-api-access-frnp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.068793 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6968d3d1-c3bb-43df-a62f-17bf683ba455" (UID: "6968d3d1-c3bb-43df-a62f-17bf683ba455"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.096638 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frnp7\" (UniqueName: \"kubernetes.io/projected/6968d3d1-c3bb-43df-a62f-17bf683ba455-kube-api-access-frnp7\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.096666 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.096675 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6968d3d1-c3bb-43df-a62f-17bf683ba455-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.304007 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82kbr" event={"ID":"6968d3d1-c3bb-43df-a62f-17bf683ba455","Type":"ContainerDied","Data":"6d1e6300951ad5f8bf7f13200709b6179655bd1ae273cacde7dc1742bdd75c74"} Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.304108 4700 scope.go:117] "RemoveContainer" containerID="247892545a67115b892ea82262bb747270c9e4b2528a457d936836da32273998" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.305394 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82kbr" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.346182 4700 scope.go:117] "RemoveContainer" containerID="973675de124f56170a81d35860689254476d16c8a3ced834fd2f082abf0c501a" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.367581 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.375722 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-82kbr"] Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.390796 4700 scope.go:117] "RemoveContainer" containerID="cd984c3627dfe98e33edef2d18a2f82fce4b16c2cd94b98a124775f8dadabf04" Oct 07 12:17:23 crc kubenswrapper[4700]: I1007 12:17:23.974427 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" path="/var/lib/kubelet/pods/6968d3d1-c3bb-43df-a62f-17bf683ba455/volumes" Oct 07 12:17:25 crc kubenswrapper[4700]: I1007 12:17:25.366105 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerStarted","Data":"d540447e032010a1b5195fc87785b5e5decd5baa78d5a05fccf5a77fb97f1be6"} Oct 07 12:17:25 crc kubenswrapper[4700]: I1007 12:17:25.366698 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerStarted","Data":"d0e984d65ee653c5b2b4a431dd87e1b24322ba89eca29bc6847d0e0bb7fc6fca"} Oct 07 12:17:26 crc kubenswrapper[4700]: I1007 12:17:26.436635 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.436604689 podStartE2EDuration="23.436604689s" podCreationTimestamp="2025-10-07 12:17:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:17:26.426020985 +0000 UTC m=+3413.222420064" watchObservedRunningTime="2025-10-07 12:17:26.436604689 +0000 UTC m=+3413.233003718" Oct 07 12:17:28 crc kubenswrapper[4700]: I1007 12:17:28.360922 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:33 crc kubenswrapper[4700]: I1007 12:17:33.360752 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:33 crc kubenswrapper[4700]: I1007 12:17:33.370939 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:33 crc kubenswrapper[4700]: I1007 12:17:33.492460 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.311286 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:17:45 crc kubenswrapper[4700]: E1007 12:17:45.312485 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="extract-content" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.312506 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="extract-content" Oct 07 12:17:45 crc kubenswrapper[4700]: E1007 12:17:45.312531 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="registry-server" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.312544 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="registry-server" Oct 07 12:17:45 crc kubenswrapper[4700]: E1007 12:17:45.312615 4700 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="extract-utilities" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.312628 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="extract-utilities" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.312956 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6968d3d1-c3bb-43df-a62f-17bf683ba455" containerName="registry-server" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.316271 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.343403 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.422166 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.422259 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.422305 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9r4\" (UniqueName: 
\"kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.524354 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.524467 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.524505 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9r4\" (UniqueName: \"kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.525021 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.525038 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.559474 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9r4\" (UniqueName: \"kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4\") pod \"community-operators-7mghc\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:45 crc kubenswrapper[4700]: I1007 12:17:45.666627 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:17:46 crc kubenswrapper[4700]: W1007 12:17:46.189153 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9d456f_8688_4f2a_929d_4e1270959d5a.slice/crio-46f62b90084c0d71ea99cb28376d4119a50a3a0f6c194d5eca82148b9108decf WatchSource:0}: Error finding container 46f62b90084c0d71ea99cb28376d4119a50a3a0f6c194d5eca82148b9108decf: Status 404 returned error can't find the container with id 46f62b90084c0d71ea99cb28376d4119a50a3a0f6c194d5eca82148b9108decf Oct 07 12:17:46 crc kubenswrapper[4700]: I1007 12:17:46.191903 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:17:46 crc kubenswrapper[4700]: I1007 12:17:46.643130 4700 generic.go:334] "Generic (PLEG): container finished" podID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerID="a10f1ae9ee5f46abb937462f9a46daedffbda7f55538af8b485d0f35fa74a377" exitCode=0 Oct 07 12:17:46 crc kubenswrapper[4700]: I1007 12:17:46.643247 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" 
event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerDied","Data":"a10f1ae9ee5f46abb937462f9a46daedffbda7f55538af8b485d0f35fa74a377"} Oct 07 12:17:46 crc kubenswrapper[4700]: I1007 12:17:46.643411 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerStarted","Data":"46f62b90084c0d71ea99cb28376d4119a50a3a0f6c194d5eca82148b9108decf"} Oct 07 12:17:49 crc kubenswrapper[4700]: I1007 12:17:49.677472 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerStarted","Data":"73c178285b31e380c9bd3da13378226f1ecd601e55aed98c43017a17da709a9b"} Oct 07 12:17:53 crc kubenswrapper[4700]: I1007 12:17:53.727167 4700 generic.go:334] "Generic (PLEG): container finished" podID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerID="73c178285b31e380c9bd3da13378226f1ecd601e55aed98c43017a17da709a9b" exitCode=0 Oct 07 12:17:53 crc kubenswrapper[4700]: I1007 12:17:53.728003 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerDied","Data":"73c178285b31e380c9bd3da13378226f1ecd601e55aed98c43017a17da709a9b"} Oct 07 12:17:55 crc kubenswrapper[4700]: I1007 12:17:55.751278 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerStarted","Data":"1a93f98ef1491aa666b7237bee16be2e698ee9dce43df72489265756412ecf91"} Oct 07 12:17:55 crc kubenswrapper[4700]: I1007 12:17:55.776189 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7mghc" podStartSLOduration=2.178193945 podStartE2EDuration="10.776170199s" podCreationTimestamp="2025-10-07 12:17:45 
+0000 UTC" firstStartedPulling="2025-10-07 12:17:46.646178679 +0000 UTC m=+3433.442577668" lastFinishedPulling="2025-10-07 12:17:55.244154913 +0000 UTC m=+3442.040553922" observedRunningTime="2025-10-07 12:17:55.767935816 +0000 UTC m=+3442.564334825" watchObservedRunningTime="2025-10-07 12:17:55.776170199 +0000 UTC m=+3442.572569188" Oct 07 12:18:05 crc kubenswrapper[4700]: I1007 12:18:05.667496 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:05 crc kubenswrapper[4700]: I1007 12:18:05.668362 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:05 crc kubenswrapper[4700]: I1007 12:18:05.751137 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:05 crc kubenswrapper[4700]: I1007 12:18:05.943992 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:06 crc kubenswrapper[4700]: I1007 12:18:06.009059 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:18:07 crc kubenswrapper[4700]: I1007 12:18:07.892706 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7mghc" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="registry-server" containerID="cri-o://1a93f98ef1491aa666b7237bee16be2e698ee9dce43df72489265756412ecf91" gracePeriod=2 Oct 07 12:18:08 crc kubenswrapper[4700]: I1007 12:18:08.907752 4700 generic.go:334] "Generic (PLEG): container finished" podID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerID="1a93f98ef1491aa666b7237bee16be2e698ee9dce43df72489265756412ecf91" exitCode=0 Oct 07 12:18:08 crc kubenswrapper[4700]: I1007 12:18:08.907849 4700 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-7mghc" event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerDied","Data":"1a93f98ef1491aa666b7237bee16be2e698ee9dce43df72489265756412ecf91"} Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.047150 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.243735 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities\") pod \"5b9d456f-8688-4f2a-929d-4e1270959d5a\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.243945 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s9r4\" (UniqueName: \"kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4\") pod \"5b9d456f-8688-4f2a-929d-4e1270959d5a\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.243992 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content\") pod \"5b9d456f-8688-4f2a-929d-4e1270959d5a\" (UID: \"5b9d456f-8688-4f2a-929d-4e1270959d5a\") " Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.245111 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities" (OuterVolumeSpecName: "utilities") pod "5b9d456f-8688-4f2a-929d-4e1270959d5a" (UID: "5b9d456f-8688-4f2a-929d-4e1270959d5a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.256090 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4" (OuterVolumeSpecName: "kube-api-access-9s9r4") pod "5b9d456f-8688-4f2a-929d-4e1270959d5a" (UID: "5b9d456f-8688-4f2a-929d-4e1270959d5a"). InnerVolumeSpecName "kube-api-access-9s9r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.316331 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b9d456f-8688-4f2a-929d-4e1270959d5a" (UID: "5b9d456f-8688-4f2a-929d-4e1270959d5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.346669 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s9r4\" (UniqueName: \"kubernetes.io/projected/5b9d456f-8688-4f2a-929d-4e1270959d5a-kube-api-access-9s9r4\") on node \"crc\" DevicePath \"\"" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.346714 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.346727 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b9d456f-8688-4f2a-929d-4e1270959d5a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.928611 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mghc" 
event={"ID":"5b9d456f-8688-4f2a-929d-4e1270959d5a","Type":"ContainerDied","Data":"46f62b90084c0d71ea99cb28376d4119a50a3a0f6c194d5eca82148b9108decf"} Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.928940 4700 scope.go:117] "RemoveContainer" containerID="1a93f98ef1491aa666b7237bee16be2e698ee9dce43df72489265756412ecf91" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.928782 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mghc" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.988891 4700 scope.go:117] "RemoveContainer" containerID="73c178285b31e380c9bd3da13378226f1ecd601e55aed98c43017a17da709a9b" Oct 07 12:18:09 crc kubenswrapper[4700]: I1007 12:18:09.992861 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:18:10 crc kubenswrapper[4700]: I1007 12:18:10.002966 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7mghc"] Oct 07 12:18:10 crc kubenswrapper[4700]: I1007 12:18:10.019366 4700 scope.go:117] "RemoveContainer" containerID="a10f1ae9ee5f46abb937462f9a46daedffbda7f55538af8b485d0f35fa74a377" Oct 07 12:18:11 crc kubenswrapper[4700]: I1007 12:18:11.977824 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" path="/var/lib/kubelet/pods/5b9d456f-8688-4f2a-929d-4e1270959d5a/volumes" Oct 07 12:19:35 crc kubenswrapper[4700]: I1007 12:19:35.381742 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.155124 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.155723 4700 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="prometheus" containerID="cri-o://1985dc8b1dc230fe52cd0ac0802d7229a57df9a2a5ae35a15ca54e1ac04e4b36" gracePeriod=600 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.155796 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="config-reloader" containerID="cri-o://d0e984d65ee653c5b2b4a431dd87e1b24322ba89eca29bc6847d0e0bb7fc6fca" gracePeriod=600 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.155986 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="thanos-sidecar" containerID="cri-o://d540447e032010a1b5195fc87785b5e5decd5baa78d5a05fccf5a77fb97f1be6" gracePeriod=600 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906243 4700 generic.go:334] "Generic (PLEG): container finished" podID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerID="d540447e032010a1b5195fc87785b5e5decd5baa78d5a05fccf5a77fb97f1be6" exitCode=0 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906814 4700 generic.go:334] "Generic (PLEG): container finished" podID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerID="d0e984d65ee653c5b2b4a431dd87e1b24322ba89eca29bc6847d0e0bb7fc6fca" exitCode=0 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906828 4700 generic.go:334] "Generic (PLEG): container finished" podID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerID="1985dc8b1dc230fe52cd0ac0802d7229a57df9a2a5ae35a15ca54e1ac04e4b36" exitCode=0 Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906332 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerDied","Data":"d540447e032010a1b5195fc87785b5e5decd5baa78d5a05fccf5a77fb97f1be6"} Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906886 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerDied","Data":"d0e984d65ee653c5b2b4a431dd87e1b24322ba89eca29bc6847d0e0bb7fc6fca"} Oct 07 12:19:37 crc kubenswrapper[4700]: I1007 12:19:37.906903 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerDied","Data":"1985dc8b1dc230fe52cd0ac0802d7229a57df9a2a5ae35a15ca54e1ac04e4b36"} Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.137445 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236354 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236411 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236431 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 
crc kubenswrapper[4700]: I1007 12:19:38.236611 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236635 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236697 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236767 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236801 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 
crc kubenswrapper[4700]: I1007 12:19:38.236837 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236858 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnk6r\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.236888 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out\") pod \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\" (UID: \"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4\") " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.237210 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.237505 4700 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.242870 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.243589 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.243611 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.244192 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r" (OuterVolumeSpecName: "kube-api-access-nnk6r") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "kube-api-access-nnk6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.244499 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out" (OuterVolumeSpecName: "config-out") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.244853 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.246478 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config" (OuterVolumeSpecName: "config") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.251893 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.258811 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.319632 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config" (OuterVolumeSpecName: "web-config") pod "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" (UID: "4ea57797-7f86-46b3-ab7e-d35d95e1c4a4"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338880 4700 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338919 4700 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338934 4700 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338948 4700 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338961 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnk6r\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-kube-api-access-nnk6r\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338972 4700 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config-out\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338984 4700 
reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-web-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.338993 4700 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.339002 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.339040 4700 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") on node \"crc\" " Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.366150 4700 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.366298 4700 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f") on node "crc" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.441007 4700 reconciler_common.go:293] "Volume detached for volume \"pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e554be8a-675d-4ec2-b2d9-1b74f4ebb37f\") on node \"crc\" DevicePath \"\"" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.926259 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4ea57797-7f86-46b3-ab7e-d35d95e1c4a4","Type":"ContainerDied","Data":"37791f44e9a471223a4c304747490276a6fb5245a486b5af18c32bd4fffcaf2d"} Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.926334 4700 scope.go:117] "RemoveContainer" containerID="d540447e032010a1b5195fc87785b5e5decd5baa78d5a05fccf5a77fb97f1be6" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.926362 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.977418 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.980028 4700 scope.go:117] "RemoveContainer" containerID="d0e984d65ee653c5b2b4a431dd87e1b24322ba89eca29bc6847d0e0bb7fc6fca" Oct 07 12:19:38 crc kubenswrapper[4700]: I1007 12:19:38.987179 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.013927 4700 scope.go:117] "RemoveContainer" containerID="1985dc8b1dc230fe52cd0ac0802d7229a57df9a2a5ae35a15ca54e1ac04e4b36" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.037457 4700 scope.go:117] "RemoveContainer" containerID="eba12554c6e1811a8e6ad023e5a462156b52507774eac34a88821da63a7c4a77" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.685428 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686122 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="extract-content" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686137 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="extract-content" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686179 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="config-reloader" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686188 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="config-reloader" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686201 4700 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="registry-server" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686211 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="registry-server" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686233 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="prometheus" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686242 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="prometheus" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686268 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="extract-utilities" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686277 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="extract-utilities" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686800 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="thanos-sidecar" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686818 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="thanos-sidecar" Oct 07 12:19:39 crc kubenswrapper[4700]: E1007 12:19:39.686834 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="init-config-reloader" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.686843 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="init-config-reloader" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.687206 4700 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="prometheus" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.687231 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="thanos-sidecar" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.687246 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9d456f-8688-4f2a-929d-4e1270959d5a" containerName="registry-server" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.687296 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" containerName="config-reloader" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.690030 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.693035 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.693504 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.700990 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cgn6d" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.700990 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.701034 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.701523 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 07 12:19:39 crc 
kubenswrapper[4700]: I1007 12:19:39.708798 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.725994 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.767830 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.767894 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.767917 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.767950 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-272g5\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc 
kubenswrapper[4700]: I1007 12:19:39.767984 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768035 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768053 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768076 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768094 4700 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768146 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.768340 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870406 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-272g5\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870483 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " 
pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870560 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870590 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870618 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870645 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870665 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870710 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870945 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.870993 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.871015 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.873545 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.873572 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.879633 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.880414 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.881094 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.884786 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.884926 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.887647 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.888366 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.888905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.894925 4700 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-272g5\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5\") pod \"prometheus-metric-storage-0\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:39 crc kubenswrapper[4700]: I1007 12:19:39.968420 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea57797-7f86-46b3-ab7e-d35d95e1c4a4" path="/var/lib/kubelet/pods/4ea57797-7f86-46b3-ab7e-d35d95e1c4a4/volumes" Oct 07 12:19:40 crc kubenswrapper[4700]: I1007 12:19:40.025172 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:19:40 crc kubenswrapper[4700]: I1007 12:19:40.498497 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:19:40 crc kubenswrapper[4700]: I1007 12:19:40.956048 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerStarted","Data":"91fdffa6e672e47f06bd04c907a5f56905b75f9a3cda20ba438eff8eae01fb7f"} Oct 07 12:19:45 crc kubenswrapper[4700]: I1007 12:19:45.021659 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerStarted","Data":"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a"} Oct 07 12:19:45 crc kubenswrapper[4700]: I1007 12:19:45.333588 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:19:45 crc kubenswrapper[4700]: I1007 12:19:45.333694 4700 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:19:53 crc kubenswrapper[4700]: I1007 12:19:53.128353 4700 generic.go:334] "Generic (PLEG): container finished" podID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" exitCode=0 Oct 07 12:19:53 crc kubenswrapper[4700]: I1007 12:19:53.128469 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerDied","Data":"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a"} Oct 07 12:19:54 crc kubenswrapper[4700]: I1007 12:19:54.163917 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerStarted","Data":"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59"} Oct 07 12:19:57 crc kubenswrapper[4700]: I1007 12:19:57.210519 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerStarted","Data":"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2"} Oct 07 12:19:58 crc kubenswrapper[4700]: I1007 12:19:58.223366 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerStarted","Data":"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38"} Oct 07 12:19:58 crc kubenswrapper[4700]: I1007 12:19:58.255038 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.255022402 
podStartE2EDuration="19.255022402s" podCreationTimestamp="2025-10-07 12:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:19:58.252737723 +0000 UTC m=+3565.049136712" watchObservedRunningTime="2025-10-07 12:19:58.255022402 +0000 UTC m=+3565.051421391" Oct 07 12:20:00 crc kubenswrapper[4700]: I1007 12:20:00.025479 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 07 12:20:10 crc kubenswrapper[4700]: I1007 12:20:10.025704 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 07 12:20:10 crc kubenswrapper[4700]: I1007 12:20:10.034718 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 07 12:20:10 crc kubenswrapper[4700]: I1007 12:20:10.350813 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 07 12:20:15 crc kubenswrapper[4700]: I1007 12:20:15.333934 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:20:15 crc kubenswrapper[4700]: I1007 12:20:15.334566 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:20:38 crc kubenswrapper[4700]: I1007 12:20:38.039922 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-2bphs"] Oct 07 12:20:38 
crc kubenswrapper[4700]: I1007 12:20:38.049269 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-2bphs"] Oct 07 12:20:39 crc kubenswrapper[4700]: I1007 12:20:39.974129 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2f4928-4400-49ed-bc37-8d9ab2aa0de6" path="/var/lib/kubelet/pods/1c2f4928-4400-49ed-bc37-8d9ab2aa0de6/volumes" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.333447 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.335251 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.335465 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.336626 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.336814 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" 
podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" gracePeriod=600 Oct 07 12:20:45 crc kubenswrapper[4700]: E1007 12:20:45.463131 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.721910 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" exitCode=0 Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.722414 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b"} Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.722539 4700 scope.go:117] "RemoveContainer" containerID="0bdab40f62fe10a31c63de84e7e40648ed0233a3222811a5befdca5ac33ff4be" Oct 07 12:20:45 crc kubenswrapper[4700]: I1007 12:20:45.723210 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:20:45 crc kubenswrapper[4700]: E1007 12:20:45.723541 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:20:48 crc kubenswrapper[4700]: I1007 12:20:48.031027 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-998f-account-create-tlv92"] Oct 07 12:20:48 crc kubenswrapper[4700]: I1007 12:20:48.042163 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-998f-account-create-tlv92"] Oct 07 12:20:49 crc kubenswrapper[4700]: I1007 12:20:49.992821 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4fb38c5-276c-41c2-8f9c-f03e43564dbe" path="/var/lib/kubelet/pods/c4fb38c5-276c-41c2-8f9c-f03e43564dbe/volumes" Oct 07 12:20:59 crc kubenswrapper[4700]: I1007 12:20:59.957657 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:20:59 crc kubenswrapper[4700]: E1007 12:20:59.958541 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:21:00 crc kubenswrapper[4700]: I1007 12:21:00.057409 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-2nc8j"] Oct 07 12:21:00 crc kubenswrapper[4700]: I1007 12:21:00.100746 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-2nc8j"] Oct 07 12:21:01 crc kubenswrapper[4700]: I1007 12:21:01.971796 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ba889b-74ce-4f70-8ee0-d5a21cc39d98" 
path="/var/lib/kubelet/pods/11ba889b-74ce-4f70-8ee0-d5a21cc39d98/volumes" Oct 07 12:21:07 crc kubenswrapper[4700]: I1007 12:21:07.759677 4700 scope.go:117] "RemoveContainer" containerID="660657e66ff61d19b133777d92fb411d0020e3d8f392295549535bd92fee14af" Oct 07 12:21:07 crc kubenswrapper[4700]: I1007 12:21:07.809622 4700 scope.go:117] "RemoveContainer" containerID="52b1fe863acc3f636ce580f3d15430d810f27b16ae938f2f15ec148b642ffd0e" Oct 07 12:21:07 crc kubenswrapper[4700]: I1007 12:21:07.872329 4700 scope.go:117] "RemoveContainer" containerID="ef381d6512f44f24c157cd4ab61301f4206444d8c19eb34d59f473c90aeb4af1" Oct 07 12:21:11 crc kubenswrapper[4700]: I1007 12:21:11.957015 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:21:11 crc kubenswrapper[4700]: E1007 12:21:11.957773 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:21:25 crc kubenswrapper[4700]: I1007 12:21:25.961169 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:21:25 crc kubenswrapper[4700]: E1007 12:21:25.962397 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:21:37 crc kubenswrapper[4700]: 
I1007 12:21:37.301856 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.006457 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.007230 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-api" containerID="cri-o://58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d" gracePeriod=30 Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.007335 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-listener" containerID="cri-o://14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98" gracePeriod=30 Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.007335 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-notifier" containerID="cri-o://f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195" gracePeriod=30 Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.007383 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-evaluator" containerID="cri-o://73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6" gracePeriod=30 Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.405350 4700 generic.go:334] "Generic (PLEG): container finished" podID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerID="58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d" exitCode=0 Oct 07 12:21:39 crc 
kubenswrapper[4700]: I1007 12:21:39.405494 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerDied","Data":"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d"} Oct 07 12:21:39 crc kubenswrapper[4700]: I1007 12:21:39.957134 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:21:39 crc kubenswrapper[4700]: E1007 12:21:39.957939 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:21:40 crc kubenswrapper[4700]: I1007 12:21:40.417738 4700 generic.go:334] "Generic (PLEG): container finished" podID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerID="73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6" exitCode=0 Oct 07 12:21:40 crc kubenswrapper[4700]: I1007 12:21:40.417778 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerDied","Data":"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6"} Oct 07 12:21:41 crc kubenswrapper[4700]: I1007 12:21:41.431636 4700 generic.go:334] "Generic (PLEG): container finished" podID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerID="f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195" exitCode=0 Oct 07 12:21:41 crc kubenswrapper[4700]: I1007 12:21:41.432985 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerDied","Data":"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195"} Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.259943 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.428652 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts\") pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.428949 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs\") pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.429046 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs\") pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.429064 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle\") pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.429135 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djt7d\" (UniqueName: \"kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d\") 
pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.429219 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data\") pod \"90228e7e-2fff-43c5-8db3-6b803d39682c\" (UID: \"90228e7e-2fff-43c5-8db3-6b803d39682c\") " Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.437156 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts" (OuterVolumeSpecName: "scripts") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.451809 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d" (OuterVolumeSpecName: "kube-api-access-djt7d") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "kube-api-access-djt7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.478064 4700 generic.go:334] "Generic (PLEG): container finished" podID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerID="14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98" exitCode=0 Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.478193 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerDied","Data":"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98"} Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.478226 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90228e7e-2fff-43c5-8db3-6b803d39682c","Type":"ContainerDied","Data":"9504d46207a78f19b5e6bfac820998541062fc83a72092937168010b0670e452"} Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.478245 4700 scope.go:117] "RemoveContainer" containerID="14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.478942 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.490295 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.493539 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.532538 4700 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.532564 4700 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.532575 4700 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.532584 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djt7d\" (UniqueName: \"kubernetes.io/projected/90228e7e-2fff-43c5-8db3-6b803d39682c-kube-api-access-djt7d\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.545370 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.551945 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data" (OuterVolumeSpecName: "config-data") pod "90228e7e-2fff-43c5-8db3-6b803d39682c" (UID: "90228e7e-2fff-43c5-8db3-6b803d39682c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.558363 4700 scope.go:117] "RemoveContainer" containerID="f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.591787 4700 scope.go:117] "RemoveContainer" containerID="73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.614511 4700 scope.go:117] "RemoveContainer" containerID="58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.634520 4700 scope.go:117] "RemoveContainer" containerID="14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.634926 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98\": container with ID starting with 14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98 not found: ID does not exist" containerID="14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.634964 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98"} err="failed to get container status \"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98\": 
rpc error: code = NotFound desc = could not find container \"14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98\": container with ID starting with 14e8137905a0c4c51619ad9331328a1736f1b4d99e5a500f57140831ba55dc98 not found: ID does not exist" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.635021 4700 scope.go:117] "RemoveContainer" containerID="f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.635285 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195\": container with ID starting with f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195 not found: ID does not exist" containerID="f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.635345 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195"} err="failed to get container status \"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195\": rpc error: code = NotFound desc = could not find container \"f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195\": container with ID starting with f5f27481206b577170dc58ad998e630abda78272002fbbbaaefc894186813195 not found: ID does not exist" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.635364 4700 scope.go:117] "RemoveContainer" containerID="73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.635877 4700 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 
12:21:44.635904 4700 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90228e7e-2fff-43c5-8db3-6b803d39682c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.636008 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6\": container with ID starting with 73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6 not found: ID does not exist" containerID="73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.636040 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6"} err="failed to get container status \"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6\": rpc error: code = NotFound desc = could not find container \"73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6\": container with ID starting with 73fce3aa665178aa4b4199787742d85f97741ce0dfc0d9b3665c0379067546b6 not found: ID does not exist" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.636170 4700 scope.go:117] "RemoveContainer" containerID="58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.636547 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d\": container with ID starting with 58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d not found: ID does not exist" containerID="58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.636568 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d"} err="failed to get container status \"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d\": rpc error: code = NotFound desc = could not find container \"58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d\": container with ID starting with 58ad82a45f336c5193483d194a619564c79c763de6e3274b529bee123c10b87d not found: ID does not exist" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.818921 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.834702 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.851844 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.852660 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-listener" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.852708 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-listener" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.852771 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-api" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.852790 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-api" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.852828 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-notifier" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.852848 4700 
state_mem.go:107] "Deleted CPUSet assignment" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-notifier" Oct 07 12:21:44 crc kubenswrapper[4700]: E1007 12:21:44.852889 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-evaluator" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.852904 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-evaluator" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.854604 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-notifier" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.854687 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-listener" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.854722 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-api" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.854755 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" containerName="aodh-evaluator" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.859483 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.864444 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.866302 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.866406 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-k7ltr" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.866477 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.867441 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 12:21:44 crc kubenswrapper[4700]: I1007 12:21:44.869809 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.043724 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-scripts\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.043802 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.043857 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-config-data\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.044008 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.044094 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89tjr\" (UniqueName: \"kubernetes.io/projected/c868df58-1fc5-45c0-967d-d42bdb1390f5-kube-api-access-89tjr\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.044219 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-public-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.146713 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89tjr\" (UniqueName: \"kubernetes.io/projected/c868df58-1fc5-45c0-967d-d42bdb1390f5-kube-api-access-89tjr\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.146893 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-public-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc 
kubenswrapper[4700]: I1007 12:21:45.146950 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-scripts\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.146994 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.147049 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-config-data\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.148025 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.151103 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.151296 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-public-tls-certs\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") 
" pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.151500 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-scripts\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.152683 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-config-data\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.153025 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c868df58-1fc5-45c0-967d-d42bdb1390f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.174548 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89tjr\" (UniqueName: \"kubernetes.io/projected/c868df58-1fc5-45c0-967d-d42bdb1390f5-kube-api-access-89tjr\") pod \"aodh-0\" (UID: \"c868df58-1fc5-45c0-967d-d42bdb1390f5\") " pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.228193 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.741887 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.746528 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:21:45 crc kubenswrapper[4700]: I1007 12:21:45.982816 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90228e7e-2fff-43c5-8db3-6b803d39682c" path="/var/lib/kubelet/pods/90228e7e-2fff-43c5-8db3-6b803d39682c/volumes" Oct 07 12:21:46 crc kubenswrapper[4700]: I1007 12:21:46.518400 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c868df58-1fc5-45c0-967d-d42bdb1390f5","Type":"ContainerStarted","Data":"c99948ddd20e72ba0a9b100852d15f4f9323fa7ad9b6fb39c7bdd7f25d967912"} Oct 07 12:21:47 crc kubenswrapper[4700]: I1007 12:21:47.535391 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c868df58-1fc5-45c0-967d-d42bdb1390f5","Type":"ContainerStarted","Data":"be033c476af075a1b68e0771b04c5effeed72f6dfa8eff5b3e29724ad4f5f49d"} Oct 07 12:21:47 crc kubenswrapper[4700]: I1007 12:21:47.535874 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c868df58-1fc5-45c0-967d-d42bdb1390f5","Type":"ContainerStarted","Data":"cdadcce47d36d8b42312cc2b8ce1e09e85d9ff9b449b45d6aa1600805474b9e8"} Oct 07 12:21:48 crc kubenswrapper[4700]: I1007 12:21:48.545600 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c868df58-1fc5-45c0-967d-d42bdb1390f5","Type":"ContainerStarted","Data":"672bf7ccd52e90535f22bcd3cc82882abf379be067c5b95313daab93b3ed94f8"} Oct 07 12:21:49 crc kubenswrapper[4700]: I1007 12:21:49.558151 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"c868df58-1fc5-45c0-967d-d42bdb1390f5","Type":"ContainerStarted","Data":"e6d62868691836afdd31bd43ccfceb87e7c030530a509f56fe3f941278cc0384"} Oct 07 12:21:49 crc kubenswrapper[4700]: I1007 12:21:49.593614 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.602631327 podStartE2EDuration="5.593595205s" podCreationTimestamp="2025-10-07 12:21:44 +0000 UTC" firstStartedPulling="2025-10-07 12:21:45.746257783 +0000 UTC m=+3672.542656772" lastFinishedPulling="2025-10-07 12:21:48.737221651 +0000 UTC m=+3675.533620650" observedRunningTime="2025-10-07 12:21:49.587576669 +0000 UTC m=+3676.383975678" watchObservedRunningTime="2025-10-07 12:21:49.593595205 +0000 UTC m=+3676.389994194" Oct 07 12:21:53 crc kubenswrapper[4700]: I1007 12:21:53.979110 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:21:53 crc kubenswrapper[4700]: E1007 12:21:53.980292 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:22:05 crc kubenswrapper[4700]: I1007 12:22:05.957838 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:22:05 crc kubenswrapper[4700]: E1007 12:22:05.959050 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:22:20 crc kubenswrapper[4700]: I1007 12:22:20.957956 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:22:20 crc kubenswrapper[4700]: E1007 12:22:20.959267 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:22:35 crc kubenswrapper[4700]: I1007 12:22:35.957984 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:22:35 crc kubenswrapper[4700]: E1007 12:22:35.959009 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:22:48 crc kubenswrapper[4700]: I1007 12:22:48.959189 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:22:48 crc kubenswrapper[4700]: E1007 12:22:48.960473 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:23:02 crc kubenswrapper[4700]: I1007 12:23:02.957717 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:23:02 crc kubenswrapper[4700]: E1007 12:23:02.958816 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.344980 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.347453 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.353336 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.392810 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.393278 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.393360 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.495564 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.495639 4700 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.495767 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.496262 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.496262 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.515564 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px\") pod \"redhat-operators-25mvc\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:16 crc kubenswrapper[4700]: I1007 12:23:16.684976 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:17 crc kubenswrapper[4700]: I1007 12:23:17.162374 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:17 crc kubenswrapper[4700]: I1007 12:23:17.643054 4700 generic.go:334] "Generic (PLEG): container finished" podID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerID="fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4" exitCode=0 Oct 07 12:23:17 crc kubenswrapper[4700]: I1007 12:23:17.643117 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerDied","Data":"fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4"} Oct 07 12:23:17 crc kubenswrapper[4700]: I1007 12:23:17.643160 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerStarted","Data":"ec4265353d43079182be7fcc8cde867a5978655d8930a4528509d4bc6b5396fe"} Oct 07 12:23:17 crc kubenswrapper[4700]: I1007 12:23:17.957755 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:23:17 crc kubenswrapper[4700]: E1007 12:23:17.958657 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:23:19 crc kubenswrapper[4700]: I1007 12:23:19.664757 4700 generic.go:334] "Generic (PLEG): container finished" podID="b8822222-c4e9-4563-b9c6-f03029e17f19" 
containerID="883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874" exitCode=0 Oct 07 12:23:19 crc kubenswrapper[4700]: I1007 12:23:19.666373 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerDied","Data":"883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874"} Oct 07 12:23:21 crc kubenswrapper[4700]: I1007 12:23:21.684224 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerStarted","Data":"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f"} Oct 07 12:23:21 crc kubenswrapper[4700]: I1007 12:23:21.707692 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-25mvc" podStartSLOduration=3.24415285 podStartE2EDuration="5.70767356s" podCreationTimestamp="2025-10-07 12:23:16 +0000 UTC" firstStartedPulling="2025-10-07 12:23:17.644956071 +0000 UTC m=+3764.441355060" lastFinishedPulling="2025-10-07 12:23:20.108476791 +0000 UTC m=+3766.904875770" observedRunningTime="2025-10-07 12:23:21.706068548 +0000 UTC m=+3768.502467527" watchObservedRunningTime="2025-10-07 12:23:21.70767356 +0000 UTC m=+3768.504072559" Oct 07 12:23:26 crc kubenswrapper[4700]: I1007 12:23:26.685382 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:26 crc kubenswrapper[4700]: I1007 12:23:26.685890 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:27 crc kubenswrapper[4700]: I1007 12:23:27.745539 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-25mvc" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="registry-server" probeResult="failure" 
output=< Oct 07 12:23:27 crc kubenswrapper[4700]: timeout: failed to connect service ":50051" within 1s Oct 07 12:23:27 crc kubenswrapper[4700]: > Oct 07 12:23:30 crc kubenswrapper[4700]: I1007 12:23:30.957801 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:23:30 crc kubenswrapper[4700]: E1007 12:23:30.959646 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:23:36 crc kubenswrapper[4700]: I1007 12:23:36.756840 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:37 crc kubenswrapper[4700]: I1007 12:23:37.628697 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:37 crc kubenswrapper[4700]: I1007 12:23:37.690194 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:37 crc kubenswrapper[4700]: I1007 12:23:37.891789 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-25mvc" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="registry-server" containerID="cri-o://eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f" gracePeriod=2 Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.638971 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.779848 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content\") pod \"b8822222-c4e9-4563-b9c6-f03029e17f19\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.779889 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities\") pod \"b8822222-c4e9-4563-b9c6-f03029e17f19\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.779936 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px\") pod \"b8822222-c4e9-4563-b9c6-f03029e17f19\" (UID: \"b8822222-c4e9-4563-b9c6-f03029e17f19\") " Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.780589 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities" (OuterVolumeSpecName: "utilities") pod "b8822222-c4e9-4563-b9c6-f03029e17f19" (UID: "b8822222-c4e9-4563-b9c6-f03029e17f19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.787868 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px" (OuterVolumeSpecName: "kube-api-access-g27px") pod "b8822222-c4e9-4563-b9c6-f03029e17f19" (UID: "b8822222-c4e9-4563-b9c6-f03029e17f19"). InnerVolumeSpecName "kube-api-access-g27px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.854798 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8822222-c4e9-4563-b9c6-f03029e17f19" (UID: "b8822222-c4e9-4563-b9c6-f03029e17f19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.882260 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.882292 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8822222-c4e9-4563-b9c6-f03029e17f19-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.882317 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/b8822222-c4e9-4563-b9c6-f03029e17f19-kube-api-access-g27px\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.904140 4700 generic.go:334] "Generic (PLEG): container finished" podID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerID="eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f" exitCode=0 Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.904232 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerDied","Data":"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f"} Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.904355 4700 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-25mvc" event={"ID":"b8822222-c4e9-4563-b9c6-f03029e17f19","Type":"ContainerDied","Data":"ec4265353d43079182be7fcc8cde867a5978655d8930a4528509d4bc6b5396fe"} Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.904388 4700 scope.go:117] "RemoveContainer" containerID="eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.904258 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25mvc" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.950473 4700 scope.go:117] "RemoveContainer" containerID="883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874" Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.963817 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:38 crc kubenswrapper[4700]: I1007 12:23:38.985278 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-25mvc"] Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.004889 4700 scope.go:117] "RemoveContainer" containerID="fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.037020 4700 scope.go:117] "RemoveContainer" containerID="eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f" Oct 07 12:23:39 crc kubenswrapper[4700]: E1007 12:23:39.037862 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f\": container with ID starting with eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f not found: ID does not exist" containerID="eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.037894 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f"} err="failed to get container status \"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f\": rpc error: code = NotFound desc = could not find container \"eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f\": container with ID starting with eefc4d9bdccd5757cb300b268e7f229f7199d33d759c026ca9f33179e0720d9f not found: ID does not exist" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.037919 4700 scope.go:117] "RemoveContainer" containerID="883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874" Oct 07 12:23:39 crc kubenswrapper[4700]: E1007 12:23:39.038287 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874\": container with ID starting with 883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874 not found: ID does not exist" containerID="883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.038333 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874"} err="failed to get container status \"883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874\": rpc error: code = NotFound desc = could not find container \"883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874\": container with ID starting with 883257e0f2e0eb389e387d9748f27fd9e1aea04c688773b9e122da8179303874 not found: ID does not exist" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.038348 4700 scope.go:117] "RemoveContainer" containerID="fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4" Oct 07 12:23:39 crc kubenswrapper[4700]: E1007 
12:23:39.038637 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4\": container with ID starting with fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4 not found: ID does not exist" containerID="fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.038684 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4"} err="failed to get container status \"fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4\": rpc error: code = NotFound desc = could not find container \"fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4\": container with ID starting with fa7b60f5836d3eaa48631c6464851935d59a7e7cd71e85beee9b6c4dfa8b33f4 not found: ID does not exist" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.217931 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:23:39 crc kubenswrapper[4700]: I1007 12:23:39.981601 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" path="/var/lib/kubelet/pods/b8822222-c4e9-4563-b9c6-f03029e17f19/volumes" Oct 07 12:23:41 crc kubenswrapper[4700]: I1007 12:23:41.958106 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:23:41 crc kubenswrapper[4700]: E1007 12:23:41.958888 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:23:42 crc kubenswrapper[4700]: I1007 12:23:42.979052 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:42 crc kubenswrapper[4700]: I1007 12:23:42.979336 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="prometheus" containerID="cri-o://9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" gracePeriod=600 Oct 07 12:23:42 crc kubenswrapper[4700]: I1007 12:23:42.979443 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="config-reloader" containerID="cri-o://b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" gracePeriod=600 Oct 07 12:23:42 crc kubenswrapper[4700]: I1007 12:23:42.979433 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="thanos-sidecar" containerID="cri-o://7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" gracePeriod=600 Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.985470 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998444 4700 generic.go:334] "Generic (PLEG): container finished" podID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" exitCode=0 Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998487 4700 generic.go:334] "Generic (PLEG): container finished" podID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" exitCode=0 Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998498 4700 generic.go:334] "Generic (PLEG): container finished" podID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" exitCode=0 Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998663 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998557 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerDied","Data":"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38"} Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998797 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerDied","Data":"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2"} Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998813 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerDied","Data":"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59"} Oct 07 12:23:43 crc kubenswrapper[4700]: 
I1007 12:23:43.998833 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f","Type":"ContainerDied","Data":"91fdffa6e672e47f06bd04c907a5f56905b75f9a3cda20ba438eff8eae01fb7f"} Oct 07 12:23:43 crc kubenswrapper[4700]: I1007 12:23:43.998854 4700 scope.go:117] "RemoveContainer" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.052403 4700 scope.go:117] "RemoveContainer" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.069718 4700 scope.go:117] "RemoveContainer" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.098836 4700 scope.go:117] "RemoveContainer" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105068 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-272g5\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105120 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105169 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105200 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105248 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105335 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105361 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105403 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105431 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105481 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.105502 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config\") pod \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\" (UID: \"89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f\") " Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.106094 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "prometheus-metric-storage-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.106259 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.111691 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.111692 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.112806 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.113497 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.115363 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out" (OuterVolumeSpecName: "config-out") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.115394 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config" (OuterVolumeSpecName: "config") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.115427 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.122630 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5" (OuterVolumeSpecName: "kube-api-access-272g5") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). InnerVolumeSpecName "kube-api-access-272g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.131044 4700 scope.go:117] "RemoveContainer" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.131531 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": container with ID starting with 7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38 not found: ID does not exist" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.131867 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38"} err="failed to get container status \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": rpc error: code = NotFound desc = could not find container \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": container with ID starting with 7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.131942 4700 scope.go:117] "RemoveContainer" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.132397 4700 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": container with ID starting with b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2 not found: ID does not exist" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.132451 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2"} err="failed to get container status \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": rpc error: code = NotFound desc = could not find container \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": container with ID starting with b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.132488 4700 scope.go:117] "RemoveContainer" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.132812 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": container with ID starting with 9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59 not found: ID does not exist" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.132890 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59"} err="failed to get container status \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": rpc error: code = NotFound 
desc = could not find container \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": container with ID starting with 9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.132957 4700 scope.go:117] "RemoveContainer" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.133276 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": container with ID starting with f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a not found: ID does not exist" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.133377 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a"} err="failed to get container status \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": rpc error: code = NotFound desc = could not find container \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": container with ID starting with f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.133438 4700 scope.go:117] "RemoveContainer" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.133690 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38"} err="failed to get container status \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": rpc error: code = 
NotFound desc = could not find container \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": container with ID starting with 7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.133768 4700 scope.go:117] "RemoveContainer" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.134161 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2"} err="failed to get container status \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": rpc error: code = NotFound desc = could not find container \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": container with ID starting with b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.134239 4700 scope.go:117] "RemoveContainer" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.134580 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59"} err="failed to get container status \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": rpc error: code = NotFound desc = could not find container \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": container with ID starting with 9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.134664 4700 scope.go:117] "RemoveContainer" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" Oct 07 12:23:44 crc 
kubenswrapper[4700]: I1007 12:23:44.134907 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a"} err="failed to get container status \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": rpc error: code = NotFound desc = could not find container \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": container with ID starting with f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.134930 4700 scope.go:117] "RemoveContainer" containerID="7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.135762 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38"} err="failed to get container status \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": rpc error: code = NotFound desc = could not find container \"7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38\": container with ID starting with 7282853fd35db2f480dc1dfb7bb7ea5c62ce7cc1cb7ddd182d11ff542cb9df38 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.135783 4700 scope.go:117] "RemoveContainer" containerID="b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.136286 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2"} err="failed to get container status \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": rpc error: code = NotFound desc = could not find container \"b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2\": container 
with ID starting with b36ec2d380001d57492bd00228973959b739899d591b0890400451716d0218e2 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.136366 4700 scope.go:117] "RemoveContainer" containerID="9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.136654 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59"} err="failed to get container status \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": rpc error: code = NotFound desc = could not find container \"9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59\": container with ID starting with 9c562659bbb947671ebc48d5e7b085c12b8a8f0b127f7a3f38c8b4477ccade59 not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.136676 4700 scope.go:117] "RemoveContainer" containerID="f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.136903 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a"} err="failed to get container status \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": rpc error: code = NotFound desc = could not find container \"f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a\": container with ID starting with f935695e58d2a1291ec54b1e5d37fe445ac85fb3e2e8236922509df17513e09a not found: ID does not exist" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.196414 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config" (OuterVolumeSpecName: "web-config") pod "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" (UID: "89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f"). 
InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207345 4700 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config-out\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207378 4700 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207390 4700 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207400 4700 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207410 4700 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-web-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207419 4700 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207428 4700 reconciler_common.go:293] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207438 4700 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207447 4700 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207455 4700 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.207464 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-272g5\" (UniqueName: \"kubernetes.io/projected/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f-kube-api-access-272g5\") on node \"crc\" DevicePath \"\"" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.343078 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.356481 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.379931 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380352 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" 
containerName="registry-server" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380367 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="registry-server" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380380 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="extract-utilities" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380387 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="extract-utilities" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380404 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="prometheus" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380409 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="prometheus" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380419 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="config-reloader" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380425 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="config-reloader" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380434 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="extract-content" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380440 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="extract-content" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380449 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" 
containerName="thanos-sidecar" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380455 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="thanos-sidecar" Oct 07 12:23:44 crc kubenswrapper[4700]: E1007 12:23:44.380483 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="init-config-reloader" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380489 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="init-config-reloader" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380695 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="thanos-sidecar" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380718 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="prometheus" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380727 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8822222-c4e9-4563-b9c6-f03029e17f19" containerName="registry-server" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.380738 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" containerName="config-reloader" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.382572 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.384493 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.384724 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cgn6d" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.384864 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.384961 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.385411 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.385524 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.396235 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.405396 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.511943 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " 
pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512026 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512060 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512092 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512146 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512184 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512213 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512283 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrs4z\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-kube-api-access-zrs4z\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512339 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512371 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.512475 4700 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614108 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614160 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614192 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614239 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: 
I1007 12:23:44.614273 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614294 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614328 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614364 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614394 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614417 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.614466 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrs4z\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-kube-api-access-zrs4z\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.615415 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.615905 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7b10d3cc-1670-4070-9e12-7049b2906d9d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.618580 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b10d3cc-1670-4070-9e12-7049b2906d9d-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.619882 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.620171 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.620182 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.621109 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-config\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.621576 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.627041 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.630747 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7b10d3cc-1670-4070-9e12-7049b2906d9d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.632797 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrs4z\" (UniqueName: \"kubernetes.io/projected/7b10d3cc-1670-4070-9e12-7049b2906d9d-kube-api-access-zrs4z\") pod \"prometheus-metric-storage-0\" (UID: \"7b10d3cc-1670-4070-9e12-7049b2906d9d\") " pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:44 crc kubenswrapper[4700]: I1007 12:23:44.700715 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 12:23:45 crc kubenswrapper[4700]: I1007 12:23:45.626942 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 12:23:45 crc kubenswrapper[4700]: I1007 12:23:45.970123 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f" path="/var/lib/kubelet/pods/89efb8eb-1618-4f58-9ecd-3ca9a7d1d21f/volumes" Oct 07 12:23:46 crc kubenswrapper[4700]: I1007 12:23:46.019055 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerStarted","Data":"ce6a64c3949d0019c635f8b6742033e8e0a3ea2cb9ce4a98fccb08454a121ffd"} Oct 07 12:23:50 crc kubenswrapper[4700]: I1007 12:23:50.052199 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerStarted","Data":"47a14d6ada64a9b621f61257fa6daa3a5357cdc4894f2725cfa0b74f601bb4f5"} Oct 07 12:23:52 crc kubenswrapper[4700]: I1007 12:23:52.957798 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:23:52 crc kubenswrapper[4700]: E1007 12:23:52.958615 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:24:00 crc kubenswrapper[4700]: I1007 12:24:00.222790 4700 generic.go:334] "Generic (PLEG): container finished" podID="7b10d3cc-1670-4070-9e12-7049b2906d9d" 
containerID="47a14d6ada64a9b621f61257fa6daa3a5357cdc4894f2725cfa0b74f601bb4f5" exitCode=0 Oct 07 12:24:00 crc kubenswrapper[4700]: I1007 12:24:00.222878 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerDied","Data":"47a14d6ada64a9b621f61257fa6daa3a5357cdc4894f2725cfa0b74f601bb4f5"} Oct 07 12:24:01 crc kubenswrapper[4700]: I1007 12:24:01.238358 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerStarted","Data":"bb52d2b033c2a60b2375b3aa0ad484378d24750f357c40ef5429d0ebd0540872"} Oct 07 12:24:03 crc kubenswrapper[4700]: I1007 12:24:03.966994 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:24:03 crc kubenswrapper[4700]: E1007 12:24:03.967503 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:24:06 crc kubenswrapper[4700]: I1007 12:24:06.297212 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerStarted","Data":"c6e267e4ec7ed007c94874f684a2ad8f4a7243940ffb31987d5ce601ab1bfc88"} Oct 07 12:24:06 crc kubenswrapper[4700]: I1007 12:24:06.297741 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"7b10d3cc-1670-4070-9e12-7049b2906d9d","Type":"ContainerStarted","Data":"ccea51915aa0034f8735966a429d1142c4fad28fe4e76f0a11938d12232a5ead"} Oct 07 12:24:06 crc kubenswrapper[4700]: I1007 12:24:06.333511 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.333492425 podStartE2EDuration="22.333492425s" podCreationTimestamp="2025-10-07 12:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:24:06.323830695 +0000 UTC m=+3813.120229694" watchObservedRunningTime="2025-10-07 12:24:06.333492425 +0000 UTC m=+3813.129891404" Oct 07 12:24:09 crc kubenswrapper[4700]: I1007 12:24:09.700905 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 07 12:24:14 crc kubenswrapper[4700]: I1007 12:24:14.700983 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 07 12:24:14 crc kubenswrapper[4700]: I1007 12:24:14.721034 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 07 12:24:15 crc kubenswrapper[4700]: I1007 12:24:15.396401 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 07 12:24:16 crc kubenswrapper[4700]: I1007 12:24:16.957499 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:24:16 crc kubenswrapper[4700]: E1007 12:24:16.958216 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:24:31 crc kubenswrapper[4700]: I1007 12:24:31.957947 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:24:31 crc kubenswrapper[4700]: E1007 12:24:31.958995 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:24:45 crc kubenswrapper[4700]: I1007 12:24:45.957241 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:24:45 crc kubenswrapper[4700]: E1007 12:24:45.958184 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:24:58 crc kubenswrapper[4700]: I1007 12:24:58.957666 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:24:58 crc kubenswrapper[4700]: E1007 12:24:58.959432 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:25:10 crc kubenswrapper[4700]: I1007 12:25:10.958392 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:25:10 crc kubenswrapper[4700]: E1007 12:25:10.959284 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:25:23 crc kubenswrapper[4700]: I1007 12:25:23.971458 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:25:23 crc kubenswrapper[4700]: E1007 12:25:23.972790 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:25:37 crc kubenswrapper[4700]: I1007 12:25:37.957060 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:25:37 crc kubenswrapper[4700]: E1007 12:25:37.957860 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:25:42 crc kubenswrapper[4700]: I1007 12:25:42.546706 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:25:49 crc kubenswrapper[4700]: I1007 12:25:49.958686 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:25:50 crc kubenswrapper[4700]: I1007 12:25:50.535134 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4"} Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.194667 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dcccm/must-gather-69xtq"] Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.207785 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.250971 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dcccm"/"kube-root-ca.crt" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.251323 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dcccm"/"openshift-service-ca.crt" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.237722 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dcccm"/"default-dockercfg-vpzz4" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.257209 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dcccm/must-gather-69xtq"] Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.355161 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.355583 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6fm\" (UniqueName: \"kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.457456 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " 
pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.457980 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6fm\" (UniqueName: \"kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.458070 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.480534 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6fm\" (UniqueName: \"kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm\") pod \"must-gather-69xtq\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:56 crc kubenswrapper[4700]: I1007 12:25:56.574567 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:25:57 crc kubenswrapper[4700]: I1007 12:25:57.118008 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dcccm/must-gather-69xtq"] Oct 07 12:25:57 crc kubenswrapper[4700]: I1007 12:25:57.609121 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/must-gather-69xtq" event={"ID":"cf42a61d-59d5-4ab2-9980-4294ed138adb","Type":"ContainerStarted","Data":"bc77925142f7b009e3732e85f1aab710213107d65250ed807a19c5c9c18ce086"} Oct 07 12:26:01 crc kubenswrapper[4700]: I1007 12:26:01.657228 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/must-gather-69xtq" event={"ID":"cf42a61d-59d5-4ab2-9980-4294ed138adb","Type":"ContainerStarted","Data":"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7"} Oct 07 12:26:02 crc kubenswrapper[4700]: I1007 12:26:02.667887 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/must-gather-69xtq" event={"ID":"cf42a61d-59d5-4ab2-9980-4294ed138adb","Type":"ContainerStarted","Data":"04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd"} Oct 07 12:26:02 crc kubenswrapper[4700]: I1007 12:26:02.688491 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dcccm/must-gather-69xtq" podStartSLOduration=2.704744206 podStartE2EDuration="6.688471792s" podCreationTimestamp="2025-10-07 12:25:56 +0000 UTC" firstStartedPulling="2025-10-07 12:25:57.138041832 +0000 UTC m=+3923.934440851" lastFinishedPulling="2025-10-07 12:26:01.121769448 +0000 UTC m=+3927.918168437" observedRunningTime="2025-10-07 12:26:02.682334663 +0000 UTC m=+3929.478733662" watchObservedRunningTime="2025-10-07 12:26:02.688471792 +0000 UTC m=+3929.484870791" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.358692 4700 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-dcccm/crc-debug-jbfpk"] Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.360526 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.514673 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.514935 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rxxj\" (UniqueName: \"kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.616439 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.616559 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.616592 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rxxj\" (UniqueName: 
\"kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.637220 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rxxj\" (UniqueName: \"kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj\") pod \"crc-debug-jbfpk\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.679753 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:26:07 crc kubenswrapper[4700]: I1007 12:26:07.764145 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" event={"ID":"039051d0-844b-4a45-bd6f-edae74cc104e","Type":"ContainerStarted","Data":"acb3f46eb24c6f36e70b4df0c06bd9455f8cb2abc05e8011e326653a39b52949"} Oct 07 12:26:18 crc kubenswrapper[4700]: I1007 12:26:18.868372 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" event={"ID":"039051d0-844b-4a45-bd6f-edae74cc104e","Type":"ContainerStarted","Data":"f1c599eeac98fc59e1c7ecf51fbb14b039a2d828fec462ba0a0cbec96dd8625c"} Oct 07 12:26:18 crc kubenswrapper[4700]: I1007 12:26:18.885893 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" podStartSLOduration=1.215611497 podStartE2EDuration="11.885873676s" podCreationTimestamp="2025-10-07 12:26:07 +0000 UTC" firstStartedPulling="2025-10-07 12:26:07.725492593 +0000 UTC m=+3934.521891582" lastFinishedPulling="2025-10-07 12:26:18.395754762 +0000 UTC m=+3945.192153761" observedRunningTime="2025-10-07 12:26:18.883651959 +0000 UTC m=+3945.680050968" 
watchObservedRunningTime="2025-10-07 12:26:18.885873676 +0000 UTC m=+3945.682272675" Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.806029 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.809715 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.841390 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.899560 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.899622 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzb8z\" (UniqueName: \"kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:02 crc kubenswrapper[4700]: I1007 12:27:02.899699 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.002488 4700 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.003097 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.003259 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.003343 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzb8z\" (UniqueName: \"kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.003582 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.022240 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzb8z\" 
(UniqueName: \"kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z\") pod \"certified-operators-qz2cc\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.148983 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:03 crc kubenswrapper[4700]: I1007 12:27:03.873471 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:04 crc kubenswrapper[4700]: I1007 12:27:04.313788 4700 generic.go:334] "Generic (PLEG): container finished" podID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerID="19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f" exitCode=0 Oct 07 12:27:04 crc kubenswrapper[4700]: I1007 12:27:04.313882 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerDied","Data":"19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f"} Oct 07 12:27:04 crc kubenswrapper[4700]: I1007 12:27:04.314065 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerStarted","Data":"577394eec174d3d3f9857fffffa342c17a1bc98ff33826059ddc34b5f8315f17"} Oct 07 12:27:04 crc kubenswrapper[4700]: I1007 12:27:04.315898 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:27:05 crc kubenswrapper[4700]: I1007 12:27:05.329531 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerStarted","Data":"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96"} Oct 
07 12:27:07 crc kubenswrapper[4700]: I1007 12:27:07.357438 4700 generic.go:334] "Generic (PLEG): container finished" podID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerID="a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96" exitCode=0 Oct 07 12:27:07 crc kubenswrapper[4700]: I1007 12:27:07.357520 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerDied","Data":"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96"} Oct 07 12:27:09 crc kubenswrapper[4700]: I1007 12:27:09.376250 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerStarted","Data":"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997"} Oct 07 12:27:09 crc kubenswrapper[4700]: I1007 12:27:09.393376 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qz2cc" podStartSLOduration=3.518497593 podStartE2EDuration="7.393363455s" podCreationTimestamp="2025-10-07 12:27:02 +0000 UTC" firstStartedPulling="2025-10-07 12:27:04.315666249 +0000 UTC m=+3991.112065238" lastFinishedPulling="2025-10-07 12:27:08.190532101 +0000 UTC m=+3994.986931100" observedRunningTime="2025-10-07 12:27:09.391456256 +0000 UTC m=+3996.187855265" watchObservedRunningTime="2025-10-07 12:27:09.393363455 +0000 UTC m=+3996.189762434" Oct 07 12:27:13 crc kubenswrapper[4700]: I1007 12:27:13.150411 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:13 crc kubenswrapper[4700]: I1007 12:27:13.150809 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:13 crc kubenswrapper[4700]: I1007 12:27:13.209096 4700 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:13 crc kubenswrapper[4700]: I1007 12:27:13.469357 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:13 crc kubenswrapper[4700]: I1007 12:27:13.527487 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.442856 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qz2cc" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="registry-server" containerID="cri-o://25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997" gracePeriod=2 Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.947254 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.983940 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities\") pod \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.984236 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content\") pod \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.984474 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzb8z\" (UniqueName: 
\"kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z\") pod \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\" (UID: \"dda54bc6-3b16-44d3-953a-96cbc6b60b78\") " Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.985540 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities" (OuterVolumeSpecName: "utilities") pod "dda54bc6-3b16-44d3-953a-96cbc6b60b78" (UID: "dda54bc6-3b16-44d3-953a-96cbc6b60b78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:15 crc kubenswrapper[4700]: I1007 12:27:15.993038 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z" (OuterVolumeSpecName: "kube-api-access-tzb8z") pod "dda54bc6-3b16-44d3-953a-96cbc6b60b78" (UID: "dda54bc6-3b16-44d3-953a-96cbc6b60b78"). InnerVolumeSpecName "kube-api-access-tzb8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.048085 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dda54bc6-3b16-44d3-953a-96cbc6b60b78" (UID: "dda54bc6-3b16-44d3-953a-96cbc6b60b78"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.087349 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.087376 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda54bc6-3b16-44d3-953a-96cbc6b60b78-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.087387 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzb8z\" (UniqueName: \"kubernetes.io/projected/dda54bc6-3b16-44d3-953a-96cbc6b60b78-kube-api-access-tzb8z\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.456872 4700 generic.go:334] "Generic (PLEG): container finished" podID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerID="25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997" exitCode=0 Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.456914 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerDied","Data":"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997"} Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.456926 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qz2cc" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.456938 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz2cc" event={"ID":"dda54bc6-3b16-44d3-953a-96cbc6b60b78","Type":"ContainerDied","Data":"577394eec174d3d3f9857fffffa342c17a1bc98ff33826059ddc34b5f8315f17"} Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.456955 4700 scope.go:117] "RemoveContainer" containerID="25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.486665 4700 scope.go:117] "RemoveContainer" containerID="a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.502971 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.513835 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qz2cc"] Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.533063 4700 scope.go:117] "RemoveContainer" containerID="19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.565654 4700 scope.go:117] "RemoveContainer" containerID="25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997" Oct 07 12:27:16 crc kubenswrapper[4700]: E1007 12:27:16.566615 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997\": container with ID starting with 25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997 not found: ID does not exist" containerID="25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.566654 4700 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997"} err="failed to get container status \"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997\": rpc error: code = NotFound desc = could not find container \"25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997\": container with ID starting with 25a0c3468b946a2e5163ef61ae36685b11cab588dfd4902feb509ff8d537a997 not found: ID does not exist" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.566683 4700 scope.go:117] "RemoveContainer" containerID="a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96" Oct 07 12:27:16 crc kubenswrapper[4700]: E1007 12:27:16.567005 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96\": container with ID starting with a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96 not found: ID does not exist" containerID="a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.567054 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96"} err="failed to get container status \"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96\": rpc error: code = NotFound desc = could not find container \"a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96\": container with ID starting with a6d87bbe653237c098c442b185e3dcda7e102a0a739a5832319ef22e4f280c96 not found: ID does not exist" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.567088 4700 scope.go:117] "RemoveContainer" containerID="19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f" Oct 07 12:27:16 crc kubenswrapper[4700]: E1007 
12:27:16.567521 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f\": container with ID starting with 19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f not found: ID does not exist" containerID="19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f" Oct 07 12:27:16 crc kubenswrapper[4700]: I1007 12:27:16.567615 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f"} err="failed to get container status \"19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f\": rpc error: code = NotFound desc = could not find container \"19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f\": container with ID starting with 19dcf7d7bad66e500aa7aefa768b7bc47636f205487d82d096e9100a6826cb6f not found: ID does not exist" Oct 07 12:27:17 crc kubenswrapper[4700]: I1007 12:27:17.974068 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" path="/var/lib/kubelet/pods/dda54bc6-3b16-44d3-953a-96cbc6b60b78/volumes" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 12:27:20.559102 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/init-config-reloader/0.log" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 12:27:20.732441 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/init-config-reloader/0.log" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 12:27:20.756593 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/alertmanager/0.log" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 
12:27:20.766978 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/config-reloader/0.log" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 12:27:20.908392 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-api/0.log" Oct 07 12:27:20 crc kubenswrapper[4700]: I1007 12:27:20.954821 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-evaluator/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.079374 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-listener/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.133937 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-notifier/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.339288 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746789dcd4-wsdtq_c1e6ae51-277f-403c-a01a-5786160b1298/barbican-api/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.342462 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746789dcd4-wsdtq_c1e6ae51-277f-403c-a01a-5786160b1298/barbican-api-log/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.529784 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-65bb648478-f5h6h_62cb738e-4901-49c1-8516-02b0c2a44482/barbican-keystone-listener-log/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.537792 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-65bb648478-f5h6h_62cb738e-4901-49c1-8516-02b0c2a44482/barbican-keystone-listener/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 
12:27:21.701967 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55444c599f-s65df_f28d9836-f2c1-4a60-97fd-324ba6b0331b/barbican-worker-log/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.711419 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55444c599f-s65df_f28d9836-f2c1-4a60-97fd-324ba6b0331b/barbican-worker/0.log" Oct 07 12:27:21 crc kubenswrapper[4700]: I1007 12:27:21.933041 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx_112df1a0-e767-41be-a95e-4f7e62024fa2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.079255 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/ceilometer-central-agent/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.130187 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/ceilometer-notification-agent/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.140314 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/proxy-httpd/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.276608 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/sg-core/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.370921 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ab9a3d0-4f1c-4650-b766-836415e6cb40/cinder-api/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.477949 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ab9a3d0-4f1c-4650-b766-836415e6cb40/cinder-api-log/0.log" Oct 07 12:27:22 crc 
kubenswrapper[4700]: I1007 12:27:22.581134 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_36ec56ec-a014-4027-a6c0-c817f5bda5ca/cinder-scheduler/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.705450 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_36ec56ec-a014-4027-a6c0-c817f5bda5ca/probe/0.log" Oct 07 12:27:22 crc kubenswrapper[4700]: I1007 12:27:22.848982 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hxcml_e0170992-8798-4624-9953-368a237e9903/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.041736 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p_efce44a2-43b7-497a-bd61-81d0bbb5259b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.277836 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9_abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.335032 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/init/0.log" Oct 07 12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.660466 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/init/0.log" Oct 07 12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.746420 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn_f2864034-dca7-4ae9-b846-17c9ba11e35c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 
12:27:23 crc kubenswrapper[4700]: I1007 12:27:23.759337 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/dnsmasq-dns/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.071811 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2c374f64-ff8f-42c4-b879-fc4a8462a252/glance-httpd/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.114334 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2c374f64-ff8f-42c4-b879-fc4a8462a252/glance-log/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.264716 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_91a6e182-e619-4e81-a9f1-4a31630788c5/glance-httpd/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.283426 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_91a6e182-e619-4e81-a9f1-4a31630788c5/glance-log/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.766773 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-575565f88c-czn8g_83eadcce-bdaa-493b-b76c-91cdfd9f8b15/heat-engine/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.870804 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5bd5586b7f-mt9tb_6d6d6a4d-b338-4b5f-8606-ecb9129b2a15/heat-api/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.939946 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-764bc4c4ff-fb769_5a792755-beef-4d08-a80d-8fd891e9027a/heat-cfnapi/0.log" Oct 07 12:27:24 crc kubenswrapper[4700]: I1007 12:27:24.986893 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh_1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.077053 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lnpzk_52767ee1-91e5-47e9-b945-70e2a4df6ec8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.253585 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8444f487fd-js794_2300bc48-b64d-42ea-bc78-be6ca9508d5b/keystone-api/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.284351 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330641-xmvkv_41b7be8c-afe7-4893-a50a-2e73d28bb1a9/keystone-cron/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.424395 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ed269e79-4083-4c3b-b44e-4986f2d82921/kube-state-metrics/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.538391 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n_075a58e4-36cd-4194-a235-b75f63adb1e2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.764255 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-744b8f5559-c67wh_3b103be5-6b3d-41f7-ba2e-34f1f5b2730a/neutron-api/0.log" Oct 07 12:27:25 crc kubenswrapper[4700]: I1007 12:27:25.880770 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-744b8f5559-c67wh_3b103be5-6b3d-41f7-ba2e-34f1f5b2730a/neutron-httpd/0.log" Oct 07 12:27:26 crc kubenswrapper[4700]: I1007 12:27:26.055492 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d_e614fc07-932e-461a-9921-3471f4649838/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:26 crc kubenswrapper[4700]: I1007 12:27:26.459725 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88/nova-api-log/0.log" Oct 07 12:27:26 crc kubenswrapper[4700]: I1007 12:27:26.642258 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88/nova-api-api/0.log" Oct 07 12:27:26 crc kubenswrapper[4700]: I1007 12:27:26.723844 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_606c2b50-ba1e-4181-8615-29f434e0597e/nova-cell0-conductor-conductor/0.log" Oct 07 12:27:26 crc kubenswrapper[4700]: I1007 12:27:26.948514 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1/nova-cell1-conductor-conductor/0.log" Oct 07 12:27:27 crc kubenswrapper[4700]: I1007 12:27:27.111982 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a4b40ae6-2f36-447e-bc97-7cbcfd970bce/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 12:27:27 crc kubenswrapper[4700]: I1007 12:27:27.276281 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wlkdf_f28c07c7-b33b-4203-a814-25cc5156660b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:27 crc kubenswrapper[4700]: I1007 12:27:27.470688 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_adba6450-c198-456b-a139-67d93e54847b/nova-metadata-log/0.log" Oct 07 12:27:27 crc kubenswrapper[4700]: I1007 12:27:27.830052 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_5785d364-839d-453a-a35f-b95ea89c2152/nova-scheduler-scheduler/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.024622 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/mysql-bootstrap/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.193166 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/mysql-bootstrap/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.265040 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/galera/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.499277 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/mysql-bootstrap/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.707171 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/mysql-bootstrap/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.734857 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/galera/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.920630 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7474ed66-6936-4cd0-b7ca-0182eaeec767/openstackclient/0.log" Oct 07 12:27:28 crc kubenswrapper[4700]: I1007 12:27:28.926329 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_adba6450-c198-456b-a139-67d93e54847b/nova-metadata-metadata/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.157113 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-m9nzp_f39e97ad-dbbb-45d4-a595-8f675165ed7d/ovn-controller/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.368345 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-44drg_de65476d-b545-432c-a5d2-5b5bd95a9369/openstack-network-exporter/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.446622 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server-init/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.721205 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server-init/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.737255 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.744594 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovs-vswitchd/0.log" Oct 07 12:27:29 crc kubenswrapper[4700]: I1007 12:27:29.944805 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-s25ms_19561523-a9ea-4632-9aa7-6be23fa3eee5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.170146 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b5f78cd-b302-4c30-87c5-82954e351d55/openstack-network-exporter/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.171191 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b5f78cd-b302-4c30-87c5-82954e351d55/ovn-northd/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.347106 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_57e7be90-ef51-432f-afa7-edbff56123e0/openstack-network-exporter/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.359812 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_57e7be90-ef51-432f-afa7-edbff56123e0/ovsdbserver-nb/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.539084 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f13aad27-7d23-4de6-8de0-c8a61809de5d/openstack-network-exporter/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.596670 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f13aad27-7d23-4de6-8de0-c8a61809de5d/ovsdbserver-sb/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.775935 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d99cbbb56-xqb5x_36efa6df-bc80-48f6-8611-e8dff3530d8e/placement-api/0.log" Oct 07 12:27:30 crc kubenswrapper[4700]: I1007 12:27:30.822469 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d99cbbb56-xqb5x_36efa6df-bc80-48f6-8611-e8dff3530d8e/placement-log/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.015305 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/init-config-reloader/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.178272 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/init-config-reloader/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.218571 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/config-reloader/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.253846 
4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/prometheus/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.420601 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/thanos-sidecar/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.483793 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/setup-container/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.715510 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/rabbitmq/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.715603 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/setup-container/0.log" Oct 07 12:27:31 crc kubenswrapper[4700]: I1007 12:27:31.904219 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/setup-container/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.084506 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/setup-container/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.180612 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/rabbitmq/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.307422 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb_5c0097f4-e71a-4bfe-8425-c87d93929a43/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: 
I1007 12:27:32.426791 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7926j_d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.597735 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5_3da1015a-c431-4f0f-971a-98b31f112e53/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.758344 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-h694x_e0c8a166-6dac-4916-a2a7-9367a5ba765c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:32 crc kubenswrapper[4700]: I1007 12:27:32.911212 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zdwmq_046d1005-6634-4a0a-b10c-f5e3faf34ba6/ssh-known-hosts-edpm-deployment/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.132481 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7cd7d44d75-xs58b_2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81/proxy-server/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.267987 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7cd7d44d75-xs58b_2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81/proxy-httpd/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.282925 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pfsbd_3352f4b9-00aa-419c-a354-1fb7b7120ad5/swift-ring-rebalance/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.483065 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-reaper/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.496525 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-auditor/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.679905 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-replicator/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.684346 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-auditor/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.688855 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-server/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.856219 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-server/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.880967 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-replicator/0.log" Oct 07 12:27:33 crc kubenswrapper[4700]: I1007 12:27:33.947671 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-updater/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.071183 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-expirer/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.073871 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-auditor/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.219560 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-replicator/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.328214 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-updater/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.344850 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-server/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.449083 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/rsync/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.558089 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/swift-recon-cron/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.713131 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q_3b253611-bde5-4dcd-9291-284951206e6f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:34 crc kubenswrapper[4700]: I1007 12:27:34.839946 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk_fb8189cc-a34b-4793-91c0-7c4d5b837374/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:27:41 crc kubenswrapper[4700]: I1007 12:27:41.791037 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81/memcached/0.log" Oct 07 12:27:57 crc kubenswrapper[4700]: I1007 12:27:57.831213 4700 generic.go:334] "Generic (PLEG): container finished" podID="039051d0-844b-4a45-bd6f-edae74cc104e" containerID="f1c599eeac98fc59e1c7ecf51fbb14b039a2d828fec462ba0a0cbec96dd8625c" exitCode=0 Oct 07 12:27:57 crc 
kubenswrapper[4700]: I1007 12:27:57.831277 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" event={"ID":"039051d0-844b-4a45-bd6f-edae74cc104e","Type":"ContainerDied","Data":"f1c599eeac98fc59e1c7ecf51fbb14b039a2d828fec462ba0a0cbec96dd8625c"} Oct 07 12:27:58 crc kubenswrapper[4700]: I1007 12:27:58.966364 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.017342 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-jbfpk"] Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.024091 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host\") pod \"039051d0-844b-4a45-bd6f-edae74cc104e\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.024216 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host" (OuterVolumeSpecName: "host") pod "039051d0-844b-4a45-bd6f-edae74cc104e" (UID: "039051d0-844b-4a45-bd6f-edae74cc104e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.024464 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rxxj\" (UniqueName: \"kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj\") pod \"039051d0-844b-4a45-bd6f-edae74cc104e\" (UID: \"039051d0-844b-4a45-bd6f-edae74cc104e\") " Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.026293 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/039051d0-844b-4a45-bd6f-edae74cc104e-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.026371 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-jbfpk"] Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.035208 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj" (OuterVolumeSpecName: "kube-api-access-2rxxj") pod "039051d0-844b-4a45-bd6f-edae74cc104e" (UID: "039051d0-844b-4a45-bd6f-edae74cc104e"). InnerVolumeSpecName "kube-api-access-2rxxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.127846 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rxxj\" (UniqueName: \"kubernetes.io/projected/039051d0-844b-4a45-bd6f-edae74cc104e-kube-api-access-2rxxj\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.858255 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb3f46eb24c6f36e70b4df0c06bd9455f8cb2abc05e8011e326653a39b52949" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.858318 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-jbfpk" Oct 07 12:27:59 crc kubenswrapper[4700]: I1007 12:27:59.976732 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039051d0-844b-4a45-bd6f-edae74cc104e" path="/var/lib/kubelet/pods/039051d0-844b-4a45-bd6f-edae74cc104e/volumes" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.224945 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dcccm/crc-debug-9rvj6"] Oct 07 12:28:00 crc kubenswrapper[4700]: E1007 12:28:00.225351 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039051d0-844b-4a45-bd6f-edae74cc104e" containerName="container-00" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225366 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="039051d0-844b-4a45-bd6f-edae74cc104e" containerName="container-00" Oct 07 12:28:00 crc kubenswrapper[4700]: E1007 12:28:00.225400 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="extract-content" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225406 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="extract-content" Oct 07 12:28:00 crc kubenswrapper[4700]: E1007 12:28:00.225425 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="registry-server" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225431 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="registry-server" Oct 07 12:28:00 crc kubenswrapper[4700]: E1007 12:28:00.225439 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="extract-utilities" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225445 4700 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="extract-utilities" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225625 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="039051d0-844b-4a45-bd6f-edae74cc104e" containerName="container-00" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.225641 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda54bc6-3b16-44d3-953a-96cbc6b60b78" containerName="registry-server" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.226248 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.354402 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx2zp\" (UniqueName: \"kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.354893 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.457212 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx2zp\" (UniqueName: \"kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.457283 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.457566 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.477278 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx2zp\" (UniqueName: \"kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp\") pod \"crc-debug-9rvj6\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.549041 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.870595 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" event={"ID":"978a746e-3bee-467f-a5bb-30699bc135bf","Type":"ContainerStarted","Data":"201793c0471af59593892fdd28d8a293164b068105dd09fd9958f1514baa47a6"} Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.870929 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" event={"ID":"978a746e-3bee-467f-a5bb-30699bc135bf","Type":"ContainerStarted","Data":"829960702faf5b85242cb66f1a697d7c2f671784489a1bcc159fe72566235b73"} Oct 07 12:28:00 crc kubenswrapper[4700]: I1007 12:28:00.887149 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" podStartSLOduration=0.887128494 podStartE2EDuration="887.128494ms" podCreationTimestamp="2025-10-07 12:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:28:00.885679626 +0000 UTC m=+4047.682078635" watchObservedRunningTime="2025-10-07 12:28:00.887128494 +0000 UTC m=+4047.683527473" Oct 07 12:28:01 crc kubenswrapper[4700]: I1007 12:28:01.885143 4700 generic.go:334] "Generic (PLEG): container finished" podID="978a746e-3bee-467f-a5bb-30699bc135bf" containerID="201793c0471af59593892fdd28d8a293164b068105dd09fd9958f1514baa47a6" exitCode=0 Oct 07 12:28:01 crc kubenswrapper[4700]: I1007 12:28:01.885212 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" event={"ID":"978a746e-3bee-467f-a5bb-30699bc135bf","Type":"ContainerDied","Data":"201793c0471af59593892fdd28d8a293164b068105dd09fd9958f1514baa47a6"} Oct 07 12:28:02 crc kubenswrapper[4700]: I1007 12:28:02.989739 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.099035 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx2zp\" (UniqueName: \"kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp\") pod \"978a746e-3bee-467f-a5bb-30699bc135bf\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.099074 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host\") pod \"978a746e-3bee-467f-a5bb-30699bc135bf\" (UID: \"978a746e-3bee-467f-a5bb-30699bc135bf\") " Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.099192 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host" (OuterVolumeSpecName: "host") pod "978a746e-3bee-467f-a5bb-30699bc135bf" (UID: "978a746e-3bee-467f-a5bb-30699bc135bf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.099830 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978a746e-3bee-467f-a5bb-30699bc135bf-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.104158 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp" (OuterVolumeSpecName: "kube-api-access-vx2zp") pod "978a746e-3bee-467f-a5bb-30699bc135bf" (UID: "978a746e-3bee-467f-a5bb-30699bc135bf"). InnerVolumeSpecName "kube-api-access-vx2zp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.201038 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx2zp\" (UniqueName: \"kubernetes.io/projected/978a746e-3bee-467f-a5bb-30699bc135bf-kube-api-access-vx2zp\") on node \"crc\" DevicePath \"\"" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.904078 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" event={"ID":"978a746e-3bee-467f-a5bb-30699bc135bf","Type":"ContainerDied","Data":"829960702faf5b85242cb66f1a697d7c2f671784489a1bcc159fe72566235b73"} Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.904169 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829960702faf5b85242cb66f1a697d7c2f671784489a1bcc159fe72566235b73" Oct 07 12:28:03 crc kubenswrapper[4700]: I1007 12:28:03.904120 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-9rvj6" Oct 07 12:28:07 crc kubenswrapper[4700]: I1007 12:28:07.565659 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-9rvj6"] Oct 07 12:28:07 crc kubenswrapper[4700]: I1007 12:28:07.580240 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-9rvj6"] Oct 07 12:28:07 crc kubenswrapper[4700]: I1007 12:28:07.977865 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978a746e-3bee-467f-a5bb-30699bc135bf" path="/var/lib/kubelet/pods/978a746e-3bee-467f-a5bb-30699bc135bf/volumes" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.767490 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dcccm/crc-debug-xbrn8"] Oct 07 12:28:08 crc kubenswrapper[4700]: E1007 12:28:08.768430 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978a746e-3bee-467f-a5bb-30699bc135bf" 
containerName="container-00" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.768445 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="978a746e-3bee-467f-a5bb-30699bc135bf" containerName="container-00" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.768639 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="978a746e-3bee-467f-a5bb-30699bc135bf" containerName="container-00" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.769270 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.918671 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnbn2\" (UniqueName: \"kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:08 crc kubenswrapper[4700]: I1007 12:28:08.919002 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.020772 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnbn2\" (UniqueName: \"kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.020960 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.021119 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.040382 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnbn2\" (UniqueName: \"kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2\") pod \"crc-debug-xbrn8\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.093799 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.974929 4700 generic.go:334] "Generic (PLEG): container finished" podID="6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" containerID="5922ac1a2224e3dfa66ee6544d06d6ce4d7cd4be0ea045bb238c5dacc698b632" exitCode=0 Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.979626 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" event={"ID":"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401","Type":"ContainerDied","Data":"5922ac1a2224e3dfa66ee6544d06d6ce4d7cd4be0ea045bb238c5dacc698b632"} Oct 07 12:28:09 crc kubenswrapper[4700]: I1007 12:28:09.979688 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" event={"ID":"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401","Type":"ContainerStarted","Data":"7ea005f7965d9413ac6077186749a060e46a6f863ec6b4c77db8ced6b2e6507d"} Oct 07 12:28:10 crc kubenswrapper[4700]: I1007 12:28:10.028299 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-xbrn8"] Oct 07 12:28:10 crc kubenswrapper[4700]: I1007 12:28:10.042021 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dcccm/crc-debug-xbrn8"] Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.077539 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.162014 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host\") pod \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.162093 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnbn2\" (UniqueName: \"kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2\") pod \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\" (UID: \"6e1fa560-dcc4-4cee-bb3c-0efdf44cd401\") " Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.162257 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host" (OuterVolumeSpecName: "host") pod "6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" (UID: "6e1fa560-dcc4-4cee-bb3c-0efdf44cd401"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.162629 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.178505 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2" (OuterVolumeSpecName: "kube-api-access-jnbn2") pod "6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" (UID: "6e1fa560-dcc4-4cee-bb3c-0efdf44cd401"). InnerVolumeSpecName "kube-api-access-jnbn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.264339 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnbn2\" (UniqueName: \"kubernetes.io/projected/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401-kube-api-access-jnbn2\") on node \"crc\" DevicePath \"\"" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.706858 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.890376 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.931069 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.936000 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.971109 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" path="/var/lib/kubelet/pods/6e1fa560-dcc4-4cee-bb3c-0efdf44cd401/volumes" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.994383 4700 scope.go:117] "RemoveContainer" containerID="5922ac1a2224e3dfa66ee6544d06d6ce4d7cd4be0ea045bb238c5dacc698b632" Oct 07 12:28:11 crc kubenswrapper[4700]: I1007 12:28:11.994599 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dcccm/crc-debug-xbrn8" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.138272 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.144408 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.180871 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/extract/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.296180 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-vvp7c_b69da5cc-fa66-4adb-b136-1efe25092b40/kube-rbac-proxy/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.398643 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-mg8jv_b5298439-5a42-4bca-aa5b-c3fb26b2e5e3/kube-rbac-proxy/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.418481 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-vvp7c_b69da5cc-fa66-4adb-b136-1efe25092b40/manager/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.517290 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-mg8jv_b5298439-5a42-4bca-aa5b-c3fb26b2e5e3/manager/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.596798 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9xkbh_6b424cb2-c37e-4db9-86f4-75132c345127/kube-rbac-proxy/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.625903 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9xkbh_6b424cb2-c37e-4db9-86f4-75132c345127/manager/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.809825 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8dhrp_0ef02b09-6290-414e-b0b3-f9d52138d53d/kube-rbac-proxy/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.860965 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8dhrp_0ef02b09-6290-414e-b0b3-f9d52138d53d/manager/0.log" Oct 07 12:28:12 crc kubenswrapper[4700]: I1007 12:28:12.945825 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s2qwp_6992fc43-2f9e-414d-8c14-f08185ed395a/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.042431 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s2qwp_6992fc43-2f9e-414d-8c14-f08185ed395a/manager/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.074367 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-69f5t_6f56ee78-9e72-4fbb-abff-985e142a17cb/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.161776 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-69f5t_6f56ee78-9e72-4fbb-abff-985e142a17cb/manager/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.234150 
4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5nnwj_0df5f995-5a5e-4c40-a498-9dd5ffd4381c/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.423744 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5nnwj_0df5f995-5a5e-4c40-a498-9dd5ffd4381c/manager/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.461029 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-dfs4r_42762f72-5039-4394-a311-299b57c3485a/manager/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.471427 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-dfs4r_42762f72-5039-4394-a311-299b57c3485a/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.624125 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-dnn6x_e260e7ed-c267-40b2-861a-9a77325e027a/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.712095 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-dnn6x_e260e7ed-c267-40b2-861a-9a77325e027a/manager/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.743768 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-n4w8d_1f959f97-0c30-4e7c-a006-9517950bc1c1/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.789378 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-n4w8d_1f959f97-0c30-4e7c-a006-9517950bc1c1/manager/0.log" Oct 07 12:28:13 crc 
kubenswrapper[4700]: I1007 12:28:13.878724 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5szjk_bc62ffd3-f1d8-46c2-8777-c6ad960d68a8/kube-rbac-proxy/0.log" Oct 07 12:28:13 crc kubenswrapper[4700]: I1007 12:28:13.944107 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5szjk_bc62ffd3-f1d8-46c2-8777-c6ad960d68a8/manager/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.058536 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jrw6r_1986b44c-0b98-4c70-a2a3-78f86e586d87/kube-rbac-proxy/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.111917 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jrw6r_1986b44c-0b98-4c70-a2a3-78f86e586d87/manager/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.188663 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xxzx6_7a64a375-b3e0-47ac-b715-cea59989a781/kube-rbac-proxy/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.356654 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xxzx6_7a64a375-b3e0-47ac-b715-cea59989a781/manager/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.399190 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-x8twn_4e24a2f2-716e-4273-86a7-7ad450736748/kube-rbac-proxy/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.437593 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-x8twn_4e24a2f2-716e-4273-86a7-7ad450736748/manager/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.538590 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s_e4140e5b-60c0-42c1-9440-0070b773f8c6/kube-rbac-proxy/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.608180 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s_e4140e5b-60c0-42c1-9440-0070b773f8c6/manager/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.708072 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-589f7cdddc-lk7np_86cca1e4-373c-41df-b120-c3199ef30fe0/kube-rbac-proxy/0.log" Oct 07 12:28:14 crc kubenswrapper[4700]: I1007 12:28:14.819346 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6489b698cc-vp52r_6eb7ce42-9c6b-49c9-b46e-333112be077d/kube-rbac-proxy/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.027931 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7lqzz_5d24bd3d-929c-478a-9c26-3a94f09dd79a/registry-server/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.086920 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6489b698cc-vp52r_6eb7ce42-9c6b-49c9-b46e-333112be077d/operator/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.283480 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-jv8c7_d50469d1-cf53-4396-9d9c-f03db7eb43f3/manager/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 
12:28:15.300196 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-jv8c7_d50469d1-cf53-4396-9d9c-f03db7eb43f3/kube-rbac-proxy/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.332652 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-xvtgc_81628b8b-eb67-4514-bbb4-44341c3962ce/kube-rbac-proxy/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.335917 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.335970 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.518663 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-xvtgc_81628b8b-eb67-4514-bbb4-44341c3962ce/manager/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.526980 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc_ad8f84bd-e06b-4015-8168-938a9e1ebeaa/operator/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.747673 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-nq2r8_0f2a771c-5adf-4c17-94ac-d7b988e3ea86/manager/0.log" Oct 07 12:28:15 crc 
kubenswrapper[4700]: I1007 12:28:15.786225 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/kube-rbac-proxy/0.log" Oct 07 12:28:15 crc kubenswrapper[4700]: I1007 12:28:15.786428 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-nq2r8_0f2a771c-5adf-4c17-94ac-d7b988e3ea86/kube-rbac-proxy/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.004708 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-9zh29_ae80f142-9ac8-4614-a0c5-b74dfe98b0c8/kube-rbac-proxy/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.022432 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-9zh29_ae80f142-9ac8-4614-a0c5-b74dfe98b0c8/manager/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.095921 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-589f7cdddc-lk7np_86cca1e4-373c-41df-b120-c3199ef30fe0/manager/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.120369 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.222625 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-khdlk_c147a183-5f67-45a0-a971-87e75df2a66e/kube-rbac-proxy/0.log" Oct 07 12:28:16 crc kubenswrapper[4700]: I1007 12:28:16.257042 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-khdlk_c147a183-5f67-45a0-a971-87e75df2a66e/manager/0.log" Oct 07 12:28:33 crc kubenswrapper[4700]: I1007 12:28:33.775230 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wcjjj_c271ef31-887c-4b30-857a-7969eb9063bf/control-plane-machine-set-operator/0.log" Oct 07 12:28:34 crc kubenswrapper[4700]: I1007 12:28:34.666777 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hbprv_4c92443d-f8f6-4941-9729-013d10138707/machine-api-operator/0.log" Oct 07 12:28:34 crc kubenswrapper[4700]: I1007 12:28:34.670343 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hbprv_4c92443d-f8f6-4941-9729-013d10138707/kube-rbac-proxy/0.log" Oct 07 12:28:45 crc kubenswrapper[4700]: I1007 12:28:45.333597 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:28:45 crc kubenswrapper[4700]: I1007 12:28:45.334264 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:28:48 crc kubenswrapper[4700]: I1007 12:28:48.609870 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lv822_43b66d69-a0b2-4c0c-85d4-107ab9700398/cert-manager-controller/0.log" Oct 07 12:28:48 crc kubenswrapper[4700]: I1007 12:28:48.771883 4700 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-x2cvv_df92deba-17c9-40f7-8079-4699a5c17bf8/cert-manager-cainjector/0.log" Oct 07 12:28:48 crc kubenswrapper[4700]: I1007 12:28:48.816126 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nxpb9_ad60c720-bd0b-4a09-807d-88587ca33ed7/cert-manager-webhook/0.log" Oct 07 12:29:01 crc kubenswrapper[4700]: I1007 12:29:01.853204 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-9r6fx_65303df1-c2ad-4edf-83c1-8b3d6bce8c33/nmstate-console-plugin/0.log" Oct 07 12:29:02 crc kubenswrapper[4700]: I1007 12:29:02.137962 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7q2md_c2c057c9-3fea-45df-9991-448998e13a79/nmstate-handler/0.log" Oct 07 12:29:02 crc kubenswrapper[4700]: I1007 12:29:02.237963 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-svznx_7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2/nmstate-metrics/0.log" Oct 07 12:29:02 crc kubenswrapper[4700]: I1007 12:29:02.270060 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-svznx_7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2/kube-rbac-proxy/0.log" Oct 07 12:29:02 crc kubenswrapper[4700]: I1007 12:29:02.433232 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-h9nxf_ab86639c-d812-4e56-9f44-f8727fa8b6b5/nmstate-operator/0.log" Oct 07 12:29:02 crc kubenswrapper[4700]: I1007 12:29:02.517828 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k4h2w_607c033a-3b68-4731-92c3-c9a9a08acd5c/nmstate-webhook/0.log" Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.334093 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.334748 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.334802 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.335689 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.335765 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4" gracePeriod=600 Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.650663 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerID="6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4" exitCode=0 Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.650744 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4"} Oct 07 12:29:15 crc kubenswrapper[4700]: I1007 12:29:15.650823 4700 scope.go:117] "RemoveContainer" containerID="5607bc9974eeb0f551316eb4488f453bd2b7dc72fa28aa7ecde26c755a3be75b" Oct 07 12:29:16 crc kubenswrapper[4700]: I1007 12:29:16.661901 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a"} Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.495894 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-47w4l_ab4aa75e-5802-4ba7-b88c-655fad15d8af/kube-rbac-proxy/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.596685 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-47w4l_ab4aa75e-5802-4ba7-b88c-655fad15d8af/controller/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.668733 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-59fcf_173c5607-1006-4c8c-afc1-79c8248bbe7a/frr-k8s-webhook-server/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.684493 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.940873 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.957780 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.958581 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:29:17 crc kubenswrapper[4700]: I1007 12:29:17.968785 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.168264 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.192693 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.221131 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.229366 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.380869 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.392844 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.409520 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.455649 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/controller/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.549545 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/kube-rbac-proxy/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.584064 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/frr-metrics/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.681405 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/kube-rbac-proxy-frr/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.761760 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/reloader/0.log" Oct 07 12:29:18 crc kubenswrapper[4700]: I1007 12:29:18.895417 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-779847879b-zbmnp_7089843f-2eed-4318-b373-ff19fa518a8d/manager/0.log" Oct 07 12:29:19 crc kubenswrapper[4700]: I1007 12:29:19.044121 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654f9bf6d-jll85_97d901d3-6f2a-4e96-8578-f169200d5f6a/webhook-server/0.log" Oct 07 12:29:19 crc kubenswrapper[4700]: I1007 12:29:19.158479 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pv9ct_30cc7c7a-bda9-4b40-972b-6c87af01ad23/kube-rbac-proxy/0.log" Oct 07 12:29:19 crc kubenswrapper[4700]: I1007 12:29:19.751710 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-pv9ct_30cc7c7a-bda9-4b40-972b-6c87af01ad23/speaker/0.log" Oct 07 12:29:20 crc kubenswrapper[4700]: I1007 12:29:20.042483 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/frr/0.log" Oct 07 12:29:33 crc kubenswrapper[4700]: I1007 12:29:33.789593 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.013138 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.034627 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.037558 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.166002 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.224287 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/extract/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.237678 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.387948 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.540906 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.551365 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.552079 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.707870 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.727018 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:29:34 crc kubenswrapper[4700]: I1007 12:29:34.753150 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/extract/0.log" Oct 07 
12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.489232 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.614956 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.638964 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.686864 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.832175 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:29:35 crc kubenswrapper[4700]: I1007 12:29:35.883453 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:29:36 crc kubenswrapper[4700]: I1007 12:29:36.081473 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:29:36 crc kubenswrapper[4700]: I1007 12:29:36.240951 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:29:36 crc kubenswrapper[4700]: I1007 12:29:36.309999 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:29:36 crc kubenswrapper[4700]: I1007 12:29:36.360551 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:29:36 crc kubenswrapper[4700]: I1007 12:29:36.379811 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/registry-server/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.128036 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.142875 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.288424 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.368200 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.442533 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.487178 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.666027 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.668174 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/registry-server/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.682661 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.747165 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/extract/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.839730 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:29:37 crc kubenswrapper[4700]: I1007 12:29:37.851448 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4vqf8_280bc2c9-6204-4779-b7c8-a09260dd2a66/marketplace-operator/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.056443 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.069354 4700 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.222580 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.353107 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.396246 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.402454 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.504255 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/registry-server/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.638461 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.638461 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.639988 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.806043 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:29:38 crc kubenswrapper[4700]: I1007 12:29:38.834564 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:29:39 crc kubenswrapper[4700]: I1007 12:29:39.292887 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/registry-server/0.log" Oct 07 12:29:51 crc kubenswrapper[4700]: I1007 12:29:51.708630 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-8xvhj_398ce44d-03fb-4ee9-ac61-2ca3fd52074e/prometheus-operator/0.log" Oct 07 12:29:51 crc kubenswrapper[4700]: I1007 12:29:51.836689 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz_0319fc60-cd28-49d8-af70-3a2306fe89fd/prometheus-operator-admission-webhook/0.log" Oct 07 12:29:51 crc kubenswrapper[4700]: I1007 12:29:51.903287 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d_7d619134-8aae-4140-b5ca-33deeac1a66c/prometheus-operator-admission-webhook/0.log" Oct 07 12:29:52 crc kubenswrapper[4700]: I1007 12:29:52.058689 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-w5vqc_c82137a0-2748-492d-bd33-39b03e9c8139/operator/0.log" Oct 07 12:29:52 crc kubenswrapper[4700]: I1007 12:29:52.099950 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-fnrbp_45384a63-61c2-4d8b-906a-e7545addde11/perses-operator/0.log" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.146241 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc"] Oct 07 12:30:00 crc kubenswrapper[4700]: E1007 12:30:00.147107 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" containerName="container-00" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.147120 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" containerName="container-00" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.147367 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1fa560-dcc4-4cee-bb3c-0efdf44cd401" containerName="container-00" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.147997 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.150139 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.150227 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.164846 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc"] Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.239238 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9jf2\" (UniqueName: \"kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.239335 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.239402 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.341075 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9jf2\" (UniqueName: \"kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.341172 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.341237 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.342058 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.346869 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.360225 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9jf2\" (UniqueName: \"kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2\") pod \"collect-profiles-29330670-68vwc\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.469006 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:00 crc kubenswrapper[4700]: I1007 12:30:00.950354 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc"] Oct 07 12:30:01 crc kubenswrapper[4700]: I1007 12:30:01.060231 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" event={"ID":"73ee73a3-c9bb-487e-846e-593e896e03c4","Type":"ContainerStarted","Data":"ef3df081814e6847da9803da9b39268b5d499077872ebbe8db51ef7fa48273a2"} Oct 07 12:30:02 crc kubenswrapper[4700]: I1007 12:30:02.071266 4700 generic.go:334] "Generic (PLEG): container finished" podID="73ee73a3-c9bb-487e-846e-593e896e03c4" containerID="c75644b0b54589c9ebac1e697456bb81ed14e9f9fd705d53a9f5072c1be6e88b" exitCode=0 Oct 07 12:30:02 crc kubenswrapper[4700]: I1007 12:30:02.071331 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" 
event={"ID":"73ee73a3-c9bb-487e-846e-593e896e03c4","Type":"ContainerDied","Data":"c75644b0b54589c9ebac1e697456bb81ed14e9f9fd705d53a9f5072c1be6e88b"} Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.495923 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.505818 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9jf2\" (UniqueName: \"kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2\") pod \"73ee73a3-c9bb-487e-846e-593e896e03c4\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.506109 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume\") pod \"73ee73a3-c9bb-487e-846e-593e896e03c4\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.506156 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume\") pod \"73ee73a3-c9bb-487e-846e-593e896e03c4\" (UID: \"73ee73a3-c9bb-487e-846e-593e896e03c4\") " Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.506938 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "73ee73a3-c9bb-487e-846e-593e896e03c4" (UID: "73ee73a3-c9bb-487e-846e-593e896e03c4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:30:03 crc kubenswrapper[4700]: I1007 12:30:03.608538 4700 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73ee73a3-c9bb-487e-846e-593e896e03c4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.095607 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" event={"ID":"73ee73a3-c9bb-487e-846e-593e896e03c4","Type":"ContainerDied","Data":"ef3df081814e6847da9803da9b39268b5d499077872ebbe8db51ef7fa48273a2"} Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.095650 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef3df081814e6847da9803da9b39268b5d499077872ebbe8db51ef7fa48273a2" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.095668 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-68vwc" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.281591 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73ee73a3-c9bb-487e-846e-593e896e03c4" (UID: "73ee73a3-c9bb-487e-846e-593e896e03c4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.281618 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2" (OuterVolumeSpecName: "kube-api-access-n9jf2") pod "73ee73a3-c9bb-487e-846e-593e896e03c4" (UID: "73ee73a3-c9bb-487e-846e-593e896e03c4"). InnerVolumeSpecName "kube-api-access-n9jf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.319443 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9jf2\" (UniqueName: \"kubernetes.io/projected/73ee73a3-c9bb-487e-846e-593e896e03c4-kube-api-access-n9jf2\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.319494 4700 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73ee73a3-c9bb-487e-846e-593e896e03c4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.580060 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq"] Oct 07 12:30:04 crc kubenswrapper[4700]: I1007 12:30:04.603122 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330625-frtmq"] Oct 07 12:30:05 crc kubenswrapper[4700]: I1007 12:30:05.968561 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8998a50a-5077-4ee0-aa24-54ee046989b3" path="/var/lib/kubelet/pods/8998a50a-5077-4ee0-aa24-54ee046989b3/volumes" Oct 07 12:30:08 crc kubenswrapper[4700]: I1007 12:30:08.335122 4700 scope.go:117] "RemoveContainer" containerID="3ba6614b3cd4d1e380f606f9ac8ac08a8fe47511de2dbd7f544641ed0a36fc2f" Oct 07 12:30:15 crc kubenswrapper[4700]: E1007 12:30:15.309413 4700 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:58194->38.102.83.13:35247: write tcp 38.102.83.13:58194->38.102.83.13:35247: write: broken pipe Oct 07 12:31:22 crc kubenswrapper[4700]: I1007 12:31:22.881576 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:22 crc kubenswrapper[4700]: E1007 12:31:22.882602 4700 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="73ee73a3-c9bb-487e-846e-593e896e03c4" containerName="collect-profiles" Oct 07 12:31:22 crc kubenswrapper[4700]: I1007 12:31:22.882622 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ee73a3-c9bb-487e-846e-593e896e03c4" containerName="collect-profiles" Oct 07 12:31:22 crc kubenswrapper[4700]: I1007 12:31:22.882921 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ee73a3-c9bb-487e-846e-593e896e03c4" containerName="collect-profiles" Oct 07 12:31:22 crc kubenswrapper[4700]: I1007 12:31:22.884731 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:22 crc kubenswrapper[4700]: I1007 12:31:22.899748 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.021264 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.021460 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhpm\" (UniqueName: \"kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.021802 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities\") pod \"community-operators-tlbvn\" (UID: 
\"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.123364 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhpm\" (UniqueName: \"kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.123517 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.123581 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.124077 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.124163 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") 
" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.143440 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhpm\" (UniqueName: \"kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm\") pod \"community-operators-tlbvn\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.220930 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:23 crc kubenswrapper[4700]: I1007 12:31:23.755443 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:24 crc kubenswrapper[4700]: I1007 12:31:24.061670 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerStarted","Data":"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e"} Oct 07 12:31:24 crc kubenswrapper[4700]: I1007 12:31:24.062040 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerStarted","Data":"03c15b0978f0a7c5594cb9a1cedbe13402aae03118af6a427cbf6582b3ab594b"} Oct 07 12:31:25 crc kubenswrapper[4700]: I1007 12:31:25.073332 4700 generic.go:334] "Generic (PLEG): container finished" podID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerID="7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e" exitCode=0 Oct 07 12:31:25 crc kubenswrapper[4700]: I1007 12:31:25.073422 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" 
event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerDied","Data":"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e"} Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.076438 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.080735 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.089952 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.125557 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerStarted","Data":"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8"} Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.223564 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.223639 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2rvj\" (UniqueName: \"kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.223766 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.325872 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.326403 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.326567 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2rvj\" (UniqueName: \"kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.326603 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.326950 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.345662 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2rvj\" (UniqueName: \"kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj\") pod \"redhat-marketplace-99kd8\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.417391 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:27 crc kubenswrapper[4700]: I1007 12:31:27.906492 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:27 crc kubenswrapper[4700]: W1007 12:31:27.913853 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126aef62_4d97_41ab_9155_7ee87bae31b3.slice/crio-3a55fe7a714fc76de1e721750a9fb162e0bde73a966c8d28bd95ab318a960dd5 WatchSource:0}: Error finding container 3a55fe7a714fc76de1e721750a9fb162e0bde73a966c8d28bd95ab318a960dd5: Status 404 returned error can't find the container with id 3a55fe7a714fc76de1e721750a9fb162e0bde73a966c8d28bd95ab318a960dd5 Oct 07 12:31:28 crc kubenswrapper[4700]: I1007 12:31:28.148460 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerStarted","Data":"3a55fe7a714fc76de1e721750a9fb162e0bde73a966c8d28bd95ab318a960dd5"} Oct 07 12:31:29 crc kubenswrapper[4700]: I1007 12:31:29.161685 4700 generic.go:334] "Generic (PLEG): container finished" 
podID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerID="f612bf6b2e44417aad4ff8ffece18c4cf246f47e483eca9172ecbe8a656aa395" exitCode=0 Oct 07 12:31:29 crc kubenswrapper[4700]: I1007 12:31:29.162011 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerDied","Data":"f612bf6b2e44417aad4ff8ffece18c4cf246f47e483eca9172ecbe8a656aa395"} Oct 07 12:31:31 crc kubenswrapper[4700]: I1007 12:31:31.199474 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerStarted","Data":"5272faf9132e5d3fff60043951274c3a9ea79d6e02444c29ef90dae3b543597f"} Oct 07 12:31:32 crc kubenswrapper[4700]: I1007 12:31:32.215336 4700 generic.go:334] "Generic (PLEG): container finished" podID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerID="5272faf9132e5d3fff60043951274c3a9ea79d6e02444c29ef90dae3b543597f" exitCode=0 Oct 07 12:31:32 crc kubenswrapper[4700]: I1007 12:31:32.215405 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerDied","Data":"5272faf9132e5d3fff60043951274c3a9ea79d6e02444c29ef90dae3b543597f"} Oct 07 12:31:32 crc kubenswrapper[4700]: I1007 12:31:32.217693 4700 generic.go:334] "Generic (PLEG): container finished" podID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerID="144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8" exitCode=0 Oct 07 12:31:32 crc kubenswrapper[4700]: I1007 12:31:32.217732 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerDied","Data":"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8"} Oct 07 12:31:33 crc kubenswrapper[4700]: I1007 12:31:33.234682 4700 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerStarted","Data":"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700"} Oct 07 12:31:33 crc kubenswrapper[4700]: I1007 12:31:33.237898 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerStarted","Data":"2ee4d13aa5aa2296a09dc13f855ebb98cc2ffa63f098c086add69dbf011cfd88"} Oct 07 12:31:33 crc kubenswrapper[4700]: I1007 12:31:33.266340 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tlbvn" podStartSLOduration=3.558448377 podStartE2EDuration="11.266319524s" podCreationTimestamp="2025-10-07 12:31:22 +0000 UTC" firstStartedPulling="2025-10-07 12:31:25.086621717 +0000 UTC m=+4251.883020716" lastFinishedPulling="2025-10-07 12:31:32.794492834 +0000 UTC m=+4259.590891863" observedRunningTime="2025-10-07 12:31:33.257102875 +0000 UTC m=+4260.053501874" watchObservedRunningTime="2025-10-07 12:31:33.266319524 +0000 UTC m=+4260.062718503" Oct 07 12:31:33 crc kubenswrapper[4700]: I1007 12:31:33.278707 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-99kd8" podStartSLOduration=2.503000745 podStartE2EDuration="6.278687315s" podCreationTimestamp="2025-10-07 12:31:27 +0000 UTC" firstStartedPulling="2025-10-07 12:31:29.166740692 +0000 UTC m=+4255.963139681" lastFinishedPulling="2025-10-07 12:31:32.942427242 +0000 UTC m=+4259.738826251" observedRunningTime="2025-10-07 12:31:33.276157919 +0000 UTC m=+4260.072556928" watchObservedRunningTime="2025-10-07 12:31:33.278687315 +0000 UTC m=+4260.075086304" Oct 07 12:31:37 crc kubenswrapper[4700]: I1007 12:31:37.418914 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:37 crc kubenswrapper[4700]: I1007 12:31:37.419366 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:37 crc kubenswrapper[4700]: I1007 12:31:37.496657 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:38 crc kubenswrapper[4700]: I1007 12:31:38.357183 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:40 crc kubenswrapper[4700]: I1007 12:31:40.472176 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:40 crc kubenswrapper[4700]: I1007 12:31:40.472821 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-99kd8" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="registry-server" containerID="cri-o://2ee4d13aa5aa2296a09dc13f855ebb98cc2ffa63f098c086add69dbf011cfd88" gracePeriod=2 Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.334002 4700 generic.go:334] "Generic (PLEG): container finished" podID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerID="b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7" exitCode=0 Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.334076 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dcccm/must-gather-69xtq" event={"ID":"cf42a61d-59d5-4ab2-9980-4294ed138adb","Type":"ContainerDied","Data":"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7"} Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.334913 4700 scope.go:117] "RemoveContainer" containerID="b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7" Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.338160 4700 
generic.go:334] "Generic (PLEG): container finished" podID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerID="2ee4d13aa5aa2296a09dc13f855ebb98cc2ffa63f098c086add69dbf011cfd88" exitCode=0 Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.338222 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerDied","Data":"2ee4d13aa5aa2296a09dc13f855ebb98cc2ffa63f098c086add69dbf011cfd88"} Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.865459 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.971186 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2rvj\" (UniqueName: \"kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj\") pod \"126aef62-4d97-41ab-9155-7ee87bae31b3\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.971342 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content\") pod \"126aef62-4d97-41ab-9155-7ee87bae31b3\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.971455 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities\") pod \"126aef62-4d97-41ab-9155-7ee87bae31b3\" (UID: \"126aef62-4d97-41ab-9155-7ee87bae31b3\") " Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.972886 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities" 
(OuterVolumeSpecName: "utilities") pod "126aef62-4d97-41ab-9155-7ee87bae31b3" (UID: "126aef62-4d97-41ab-9155-7ee87bae31b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.982980 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj" (OuterVolumeSpecName: "kube-api-access-b2rvj") pod "126aef62-4d97-41ab-9155-7ee87bae31b3" (UID: "126aef62-4d97-41ab-9155-7ee87bae31b3"). InnerVolumeSpecName "kube-api-access-b2rvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:31:41 crc kubenswrapper[4700]: I1007 12:31:41.997625 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "126aef62-4d97-41ab-9155-7ee87bae31b3" (UID: "126aef62-4d97-41ab-9155-7ee87bae31b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.030865 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dcccm_must-gather-69xtq_cf42a61d-59d5-4ab2-9980-4294ed138adb/gather/0.log" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.077236 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.077276 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2rvj\" (UniqueName: \"kubernetes.io/projected/126aef62-4d97-41ab-9155-7ee87bae31b3-kube-api-access-b2rvj\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.077293 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/126aef62-4d97-41ab-9155-7ee87bae31b3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.358743 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99kd8" event={"ID":"126aef62-4d97-41ab-9155-7ee87bae31b3","Type":"ContainerDied","Data":"3a55fe7a714fc76de1e721750a9fb162e0bde73a966c8d28bd95ab318a960dd5"} Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.358801 4700 scope.go:117] "RemoveContainer" containerID="2ee4d13aa5aa2296a09dc13f855ebb98cc2ffa63f098c086add69dbf011cfd88" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.358816 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99kd8" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.382080 4700 scope.go:117] "RemoveContainer" containerID="5272faf9132e5d3fff60043951274c3a9ea79d6e02444c29ef90dae3b543597f" Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.408515 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.417798 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-99kd8"] Oct 07 12:31:42 crc kubenswrapper[4700]: I1007 12:31:42.993234 4700 scope.go:117] "RemoveContainer" containerID="f612bf6b2e44417aad4ff8ffece18c4cf246f47e483eca9172ecbe8a656aa395" Oct 07 12:31:43 crc kubenswrapper[4700]: I1007 12:31:43.221419 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:43 crc kubenswrapper[4700]: I1007 12:31:43.221802 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:43 crc kubenswrapper[4700]: I1007 12:31:43.305929 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:43 crc kubenswrapper[4700]: I1007 12:31:43.414096 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:43 crc kubenswrapper[4700]: I1007 12:31:43.979898 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" path="/var/lib/kubelet/pods/126aef62-4d97-41ab-9155-7ee87bae31b3/volumes" Oct 07 12:31:44 crc kubenswrapper[4700]: I1007 12:31:44.869634 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:45 crc 
kubenswrapper[4700]: I1007 12:31:45.333709 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:31:45 crc kubenswrapper[4700]: I1007 12:31:45.334040 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:31:45 crc kubenswrapper[4700]: I1007 12:31:45.388755 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tlbvn" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="registry-server" containerID="cri-o://f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700" gracePeriod=2 Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.243039 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.365601 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content\") pod \"57e102de-28fa-4c52-8ef5-418d319f54a3\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.366438 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfhpm\" (UniqueName: \"kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm\") pod \"57e102de-28fa-4c52-8ef5-418d319f54a3\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.366478 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities\") pod \"57e102de-28fa-4c52-8ef5-418d319f54a3\" (UID: \"57e102de-28fa-4c52-8ef5-418d319f54a3\") " Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.368397 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities" (OuterVolumeSpecName: "utilities") pod "57e102de-28fa-4c52-8ef5-418d319f54a3" (UID: "57e102de-28fa-4c52-8ef5-418d319f54a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.373681 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm" (OuterVolumeSpecName: "kube-api-access-rfhpm") pod "57e102de-28fa-4c52-8ef5-418d319f54a3" (UID: "57e102de-28fa-4c52-8ef5-418d319f54a3"). InnerVolumeSpecName "kube-api-access-rfhpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.408763 4700 generic.go:334] "Generic (PLEG): container finished" podID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerID="f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700" exitCode=0 Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.408822 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerDied","Data":"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700"} Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.408857 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlbvn" event={"ID":"57e102de-28fa-4c52-8ef5-418d319f54a3","Type":"ContainerDied","Data":"03c15b0978f0a7c5594cb9a1cedbe13402aae03118af6a427cbf6582b3ab594b"} Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.408879 4700 scope.go:117] "RemoveContainer" containerID="f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.409028 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlbvn" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.415596 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57e102de-28fa-4c52-8ef5-418d319f54a3" (UID: "57e102de-28fa-4c52-8ef5-418d319f54a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.468593 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.468622 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57e102de-28fa-4c52-8ef5-418d319f54a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.468632 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfhpm\" (UniqueName: \"kubernetes.io/projected/57e102de-28fa-4c52-8ef5-418d319f54a3-kube-api-access-rfhpm\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.477008 4700 scope.go:117] "RemoveContainer" containerID="144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.498966 4700 scope.go:117] "RemoveContainer" containerID="7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.556515 4700 scope.go:117] "RemoveContainer" containerID="f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700" Oct 07 12:31:46 crc kubenswrapper[4700]: E1007 12:31:46.556895 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700\": container with ID starting with f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700 not found: ID does not exist" containerID="f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.556925 4700 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700"} err="failed to get container status \"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700\": rpc error: code = NotFound desc = could not find container \"f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700\": container with ID starting with f3b24f33fbf15ce2f585bcec6067e061f34ce08c24314a0f8782bdff1950e700 not found: ID does not exist" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.556946 4700 scope.go:117] "RemoveContainer" containerID="144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8" Oct 07 12:31:46 crc kubenswrapper[4700]: E1007 12:31:46.557590 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8\": container with ID starting with 144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8 not found: ID does not exist" containerID="144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.557634 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8"} err="failed to get container status \"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8\": rpc error: code = NotFound desc = could not find container \"144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8\": container with ID starting with 144639db5a4cd9f6f1cd9f5acbcad3af941866b4bf03e9b36d43526ec0c90ab8 not found: ID does not exist" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.557648 4700 scope.go:117] "RemoveContainer" containerID="7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e" Oct 07 12:31:46 crc kubenswrapper[4700]: E1007 12:31:46.558023 4700 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e\": container with ID starting with 7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e not found: ID does not exist" containerID="7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.558048 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e"} err="failed to get container status \"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e\": rpc error: code = NotFound desc = could not find container \"7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e\": container with ID starting with 7f53afe6243aa8a6b4b0d5b40082c75faf127571956b2cf229917fbbdf79987e not found: ID does not exist" Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.755740 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:46 crc kubenswrapper[4700]: I1007 12:31:46.766764 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tlbvn"] Oct 07 12:31:47 crc kubenswrapper[4700]: I1007 12:31:47.970233 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" path="/var/lib/kubelet/pods/57e102de-28fa-4c52-8ef5-418d319f54a3/volumes" Oct 07 12:31:50 crc kubenswrapper[4700]: I1007 12:31:50.533377 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dcccm/must-gather-69xtq"] Oct 07 12:31:50 crc kubenswrapper[4700]: I1007 12:31:50.534534 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dcccm/must-gather-69xtq" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="copy" 
containerID="cri-o://04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd" gracePeriod=2 Oct 07 12:31:50 crc kubenswrapper[4700]: I1007 12:31:50.544959 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dcccm/must-gather-69xtq"] Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.257403 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dcccm_must-gather-69xtq_cf42a61d-59d5-4ab2-9980-4294ed138adb/copy/0.log" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.258372 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.394148 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6fm\" (UniqueName: \"kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm\") pod \"cf42a61d-59d5-4ab2-9980-4294ed138adb\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.394237 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output\") pod \"cf42a61d-59d5-4ab2-9980-4294ed138adb\" (UID: \"cf42a61d-59d5-4ab2-9980-4294ed138adb\") " Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.401924 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm" (OuterVolumeSpecName: "kube-api-access-rn6fm") pod "cf42a61d-59d5-4ab2-9980-4294ed138adb" (UID: "cf42a61d-59d5-4ab2-9980-4294ed138adb"). InnerVolumeSpecName "kube-api-access-rn6fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.457687 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dcccm_must-gather-69xtq_cf42a61d-59d5-4ab2-9980-4294ed138adb/copy/0.log" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.458022 4700 generic.go:334] "Generic (PLEG): container finished" podID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerID="04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd" exitCode=143 Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.458260 4700 scope.go:117] "RemoveContainer" containerID="04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.458402 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dcccm/must-gather-69xtq" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.493943 4700 scope.go:117] "RemoveContainer" containerID="b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.496220 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6fm\" (UniqueName: \"kubernetes.io/projected/cf42a61d-59d5-4ab2-9980-4294ed138adb-kube-api-access-rn6fm\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.550853 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cf42a61d-59d5-4ab2-9980-4294ed138adb" (UID: "cf42a61d-59d5-4ab2-9980-4294ed138adb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.574867 4700 scope.go:117] "RemoveContainer" containerID="04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd" Oct 07 12:31:51 crc kubenswrapper[4700]: E1007 12:31:51.575347 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd\": container with ID starting with 04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd not found: ID does not exist" containerID="04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.575391 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd"} err="failed to get container status \"04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd\": rpc error: code = NotFound desc = could not find container \"04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd\": container with ID starting with 04be5a04d1e657216ee7afd448900987a84c49848cc6d603b81da949494850bd not found: ID does not exist" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.575420 4700 scope.go:117] "RemoveContainer" containerID="b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7" Oct 07 12:31:51 crc kubenswrapper[4700]: E1007 12:31:51.575711 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7\": container with ID starting with b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7 not found: ID does not exist" containerID="b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.575740 
4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7"} err="failed to get container status \"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7\": rpc error: code = NotFound desc = could not find container \"b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7\": container with ID starting with b20e265846b22b583cc2e603b2350d4fe1fbf35d4807804e6c45f1eb71b295f7 not found: ID does not exist" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.597709 4700 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf42a61d-59d5-4ab2-9980-4294ed138adb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 12:31:51 crc kubenswrapper[4700]: I1007 12:31:51.976814 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" path="/var/lib/kubelet/pods/cf42a61d-59d5-4ab2-9980-4294ed138adb/volumes" Oct 07 12:32:15 crc kubenswrapper[4700]: I1007 12:32:15.334658 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:32:15 crc kubenswrapper[4700]: I1007 12:32:15.335588 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.495826 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2n8h/must-gather-zzx7b"] Oct 07 12:32:27 crc 
kubenswrapper[4700]: E1007 12:32:27.498705 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="extract-content" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.498823 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="extract-content" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.498906 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="gather" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.498978 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="gather" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.499053 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="extract-content" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.499118 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="extract-content" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.499331 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="extract-utilities" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.499431 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="extract-utilities" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.499505 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.499575 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 
12:32:27.499650 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.499718 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.499828 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="copy" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.499900 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="copy" Oct 07 12:32:27 crc kubenswrapper[4700]: E1007 12:32:27.499987 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="extract-utilities" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.500056 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="extract-utilities" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.500560 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="gather" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.500691 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf42a61d-59d5-4ab2-9980-4294ed138adb" containerName="copy" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.500766 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e102de-28fa-4c52-8ef5-418d319f54a3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.505049 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="126aef62-4d97-41ab-9155-7ee87bae31b3" containerName="registry-server" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.506923 4700 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.510953 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2n8h"/"openshift-service-ca.crt" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.511196 4700 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2n8h"/"kube-root-ca.crt" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.548007 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2n8h/must-gather-zzx7b"] Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.688044 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.688120 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtsm\" (UniqueName: \"kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.791209 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.791570 4700 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8xtsm\" (UniqueName: \"kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.791852 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.822265 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtsm\" (UniqueName: \"kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm\") pod \"must-gather-zzx7b\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:27 crc kubenswrapper[4700]: I1007 12:32:27.833180 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:32:28 crc kubenswrapper[4700]: I1007 12:32:28.519244 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2n8h/must-gather-zzx7b"] Oct 07 12:32:28 crc kubenswrapper[4700]: I1007 12:32:28.939414 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" event={"ID":"aca758f7-c3a4-4f0e-b775-6b563550711f","Type":"ContainerStarted","Data":"60212ce629dd4e0bff4c9d6f7d43e25433be8c795cbebf99155d047e626eb692"} Oct 07 12:32:28 crc kubenswrapper[4700]: I1007 12:32:28.939786 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" event={"ID":"aca758f7-c3a4-4f0e-b775-6b563550711f","Type":"ContainerStarted","Data":"8a57aad41667682eb93bba7aa239e1afb343daa3544ba72d8cc2900d6e095545"} Oct 07 12:32:29 crc kubenswrapper[4700]: I1007 12:32:29.951416 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" event={"ID":"aca758f7-c3a4-4f0e-b775-6b563550711f","Type":"ContainerStarted","Data":"5690ab09315eeef0f9125f29e6755a99a20a88655b8145566c4d619e5a1c9fcb"} Oct 07 12:32:29 crc kubenswrapper[4700]: I1007 12:32:29.969638 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" podStartSLOduration=2.969614938 podStartE2EDuration="2.969614938s" podCreationTimestamp="2025-10-07 12:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:32:29.969234788 +0000 UTC m=+4316.765633797" watchObservedRunningTime="2025-10-07 12:32:29.969614938 +0000 UTC m=+4316.766013937" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.489961 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-8zh8b"] Oct 07 12:32:33 crc kubenswrapper[4700]: 
I1007 12:32:33.491766 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.493786 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d2n8h"/"default-dockercfg-jqcts" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.537356 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.537446 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhxb\" (UniqueName: \"kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.638976 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.639067 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhxb\" (UniqueName: \"kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.639561 4700 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.664892 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhxb\" (UniqueName: \"kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb\") pod \"crc-debug-8zh8b\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: I1007 12:32:33.811752 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:32:33 crc kubenswrapper[4700]: W1007 12:32:33.868216 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3701aa_5891_4023_9d82_72487c196cd7.slice/crio-b3fc96ab859c944257cead52d8fe3fdc068dbf084b78b967c5e02798ed652273 WatchSource:0}: Error finding container b3fc96ab859c944257cead52d8fe3fdc068dbf084b78b967c5e02798ed652273: Status 404 returned error can't find the container with id b3fc96ab859c944257cead52d8fe3fdc068dbf084b78b967c5e02798ed652273 Oct 07 12:32:34 crc kubenswrapper[4700]: I1007 12:32:34.014469 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" event={"ID":"7b3701aa-5891-4023-9d82-72487c196cd7","Type":"ContainerStarted","Data":"b3fc96ab859c944257cead52d8fe3fdc068dbf084b78b967c5e02798ed652273"} Oct 07 12:32:35 crc kubenswrapper[4700]: I1007 12:32:35.024494 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" event={"ID":"7b3701aa-5891-4023-9d82-72487c196cd7","Type":"ContainerStarted","Data":"77c25e2b945b0b6281ca00d66c0a048772f5386d6f2640ce80fe2e10f9f94331"} Oct 
07 12:32:45 crc kubenswrapper[4700]: I1007 12:32:45.334266 4700 patch_prober.go:28] interesting pod/machine-config-daemon-v6h5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:32:45 crc kubenswrapper[4700]: I1007 12:32:45.334915 4700 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:32:45 crc kubenswrapper[4700]: I1007 12:32:45.334965 4700 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" Oct 07 12:32:45 crc kubenswrapper[4700]: I1007 12:32:45.335819 4700 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a"} pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:32:45 crc kubenswrapper[4700]: I1007 12:32:45.335882 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" containerName="machine-config-daemon" containerID="cri-o://dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" gracePeriod=600 Oct 07 12:32:46 crc kubenswrapper[4700]: I1007 12:32:46.159153 4700 generic.go:334] "Generic (PLEG): container finished" podID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" 
containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" exitCode=0 Oct 07 12:32:46 crc kubenswrapper[4700]: I1007 12:32:46.159218 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerDied","Data":"dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a"} Oct 07 12:32:46 crc kubenswrapper[4700]: I1007 12:32:46.159927 4700 scope.go:117] "RemoveContainer" containerID="6854934ce5d0721a7ce9761f6927f75206363bc994778df1c1e2fd1114dbbdd4" Oct 07 12:32:46 crc kubenswrapper[4700]: E1007 12:32:46.167098 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:32:47 crc kubenswrapper[4700]: I1007 12:32:47.170518 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:32:47 crc kubenswrapper[4700]: E1007 12:32:47.171073 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:32:47 crc kubenswrapper[4700]: I1007 12:32:47.196915 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" podStartSLOduration=14.1968945 
podStartE2EDuration="14.1968945s" podCreationTimestamp="2025-10-07 12:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:32:35.041396171 +0000 UTC m=+4321.837795160" watchObservedRunningTime="2025-10-07 12:32:47.1968945 +0000 UTC m=+4333.993293499" Oct 07 12:32:58 crc kubenswrapper[4700]: I1007 12:32:58.956915 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:32:58 crc kubenswrapper[4700]: E1007 12:32:58.957595 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:33:08 crc kubenswrapper[4700]: I1007 12:33:08.520188 4700 scope.go:117] "RemoveContainer" containerID="f1c599eeac98fc59e1c7ecf51fbb14b039a2d828fec462ba0a0cbec96dd8625c" Oct 07 12:33:11 crc kubenswrapper[4700]: I1007 12:33:11.959419 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:33:11 crc kubenswrapper[4700]: E1007 12:33:11.960087 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:33:26 crc kubenswrapper[4700]: I1007 12:33:26.957407 4700 scope.go:117] "RemoveContainer" 
containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:33:26 crc kubenswrapper[4700]: E1007 12:33:26.958127 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:33:38 crc kubenswrapper[4700]: I1007 12:33:38.958163 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:33:38 crc kubenswrapper[4700]: E1007 12:33:38.959524 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:33:50 crc kubenswrapper[4700]: I1007 12:33:50.957165 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:33:50 crc kubenswrapper[4700]: E1007 12:33:50.957961 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.298170 4700 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/init-config-reloader/0.log" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.603568 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/alertmanager/0.log" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.619932 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/init-config-reloader/0.log" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.734410 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_de281a78-c284-4c5e-8312-6661e2543668/config-reloader/0.log" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.867648 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-api/0.log" Oct 07 12:34:04 crc kubenswrapper[4700]: I1007 12:34:04.955640 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-evaluator/0.log" Oct 07 12:34:05 crc kubenswrapper[4700]: I1007 12:34:05.081643 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-notifier/0.log" Oct 07 12:34:05 crc kubenswrapper[4700]: I1007 12:34:05.121628 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c868df58-1fc5-45c0-967d-d42bdb1390f5/aodh-listener/0.log" Oct 07 12:34:05 crc kubenswrapper[4700]: I1007 12:34:05.957864 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:34:05 crc kubenswrapper[4700]: E1007 12:34:05.958489 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.475006 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746789dcd4-wsdtq_c1e6ae51-277f-403c-a01a-5786160b1298/barbican-api/0.log" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.489300 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746789dcd4-wsdtq_c1e6ae51-277f-403c-a01a-5786160b1298/barbican-api-log/0.log" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.702882 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-65bb648478-f5h6h_62cb738e-4901-49c1-8516-02b0c2a44482/barbican-keystone-listener/0.log" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.720320 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-65bb648478-f5h6h_62cb738e-4901-49c1-8516-02b0c2a44482/barbican-keystone-listener-log/0.log" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.882016 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55444c599f-s65df_f28d9836-f2c1-4a60-97fd-324ba6b0331b/barbican-worker/0.log" Oct 07 12:34:06 crc kubenswrapper[4700]: I1007 12:34:06.911191 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55444c599f-s65df_f28d9836-f2c1-4a60-97fd-324ba6b0331b/barbican-worker-log/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.100389 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2jkzx_112df1a0-e767-41be-a95e-4f7e62024fa2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.301812 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/ceilometer-notification-agent/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.302387 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/ceilometer-central-agent/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.356510 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/proxy-httpd/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.375455 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dff20986-65c2-4eb2-859c-55ea212165b5/sg-core/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.550614 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ab9a3d0-4f1c-4650-b766-836415e6cb40/cinder-api/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.568113 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ab9a3d0-4f1c-4650-b766-836415e6cb40/cinder-api-log/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.766513 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_36ec56ec-a014-4027-a6c0-c817f5bda5ca/probe/0.log" Oct 07 12:34:07 crc kubenswrapper[4700]: I1007 12:34:07.796118 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_36ec56ec-a014-4027-a6c0-c817f5bda5ca/cinder-scheduler/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.259771 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hxcml_e0170992-8798-4624-9953-368a237e9903/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.322803 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6gh6p_efce44a2-43b7-497a-bd61-81d0bbb5259b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.504731 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9rwp9_abd6f193-5bf2-4c18-b7b7-d3395f8fe1ff/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.625285 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/init/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.782874 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/dnsmasq-dns/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.787114 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-42kbj_33bd4ab3-f047-4932-a264-163e46ec9749/init/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.825832 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wkfzn_f2864034-dca7-4ae9-b846-17c9ba11e35c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:08 crc kubenswrapper[4700]: I1007 12:34:08.981208 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2c374f64-ff8f-42c4-b879-fc4a8462a252/glance-httpd/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.011197 4700 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2c374f64-ff8f-42c4-b879-fc4a8462a252/glance-log/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.224767 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_91a6e182-e619-4e81-a9f1-4a31630788c5/glance-log/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.254739 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_91a6e182-e619-4e81-a9f1-4a31630788c5/glance-httpd/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.419470 4700 scope.go:117] "RemoveContainer" containerID="201793c0471af59593892fdd28d8a293164b068105dd09fd9958f1514baa47a6" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.742717 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-575565f88c-czn8g_83eadcce-bdaa-493b-b76c-91cdfd9f8b15/heat-engine/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.774725 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5bd5586b7f-mt9tb_6d6d6a4d-b338-4b5f-8606-ecb9129b2a15/heat-api/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.899224 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-764bc4c4ff-fb769_5a792755-beef-4d08-a80d-8fd891e9027a/heat-cfnapi/0.log" Oct 07 12:34:09 crc kubenswrapper[4700]: I1007 12:34:09.904624 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nwbdh_1cfb28ea-af7c-4518-8e0a-34a0e87fb5f5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.003422 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lnpzk_52767ee1-91e5-47e9-b945-70e2a4df6ec8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 
07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.193226 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330641-xmvkv_41b7be8c-afe7-4893-a50a-2e73d28bb1a9/keystone-cron/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.205596 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8444f487fd-js794_2300bc48-b64d-42ea-bc78-be6ca9508d5b/keystone-api/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.352152 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ed269e79-4083-4c3b-b44e-4986f2d82921/kube-state-metrics/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.433658 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5kf4n_075a58e4-36cd-4194-a235-b75f63adb1e2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.707398 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-744b8f5559-c67wh_3b103be5-6b3d-41f7-ba2e-34f1f5b2730a/neutron-api/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.765380 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-744b8f5559-c67wh_3b103be5-6b3d-41f7-ba2e-34f1f5b2730a/neutron-httpd/0.log" Oct 07 12:34:10 crc kubenswrapper[4700]: I1007 12:34:10.973450 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kdt2d_e614fc07-932e-461a-9921-3471f4649838/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:11 crc kubenswrapper[4700]: I1007 12:34:11.373777 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88/nova-api-log/0.log" Oct 07 12:34:11 crc kubenswrapper[4700]: I1007 12:34:11.728545 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_606c2b50-ba1e-4181-8615-29f434e0597e/nova-cell0-conductor-conductor/0.log" Oct 07 12:34:11 crc kubenswrapper[4700]: I1007 12:34:11.736380 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fff6d8a-ae4e-4240-b93a-2e3c6ef20f88/nova-api-api/0.log" Oct 07 12:34:12 crc kubenswrapper[4700]: I1007 12:34:12.062561 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c70174ca-8b17-4f4a-9c6f-0df36cdd3fe1/nova-cell1-conductor-conductor/0.log" Oct 07 12:34:12 crc kubenswrapper[4700]: I1007 12:34:12.129753 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a4b40ae6-2f36-447e-bc97-7cbcfd970bce/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 12:34:12 crc kubenswrapper[4700]: I1007 12:34:12.326688 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wlkdf_f28c07c7-b33b-4203-a814-25cc5156660b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:12 crc kubenswrapper[4700]: I1007 12:34:12.442053 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_adba6450-c198-456b-a139-67d93e54847b/nova-metadata-log/0.log" Oct 07 12:34:12 crc kubenswrapper[4700]: I1007 12:34:12.921534 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5785d364-839d-453a-a35f-b95ea89c2152/nova-scheduler-scheduler/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: I1007 12:34:13.181275 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/mysql-bootstrap/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: I1007 12:34:13.353472 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/mysql-bootstrap/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: 
I1007 12:34:13.371506 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_793ba797-8da0-4e56-8dcc-14d7d2b0e217/galera/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: I1007 12:34:13.576511 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/mysql-bootstrap/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: I1007 12:34:13.846892 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/mysql-bootstrap/0.log" Oct 07 12:34:13 crc kubenswrapper[4700]: I1007 12:34:13.867563 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0706b451-8379-454a-bf71-483b779cb17b/galera/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.106261 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7474ed66-6936-4cd0-b7ca-0182eaeec767/openstackclient/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.269410 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_adba6450-c198-456b-a139-67d93e54847b/nova-metadata-metadata/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.308164 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m9nzp_f39e97ad-dbbb-45d4-a595-8f675165ed7d/ovn-controller/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.493270 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-44drg_de65476d-b545-432c-a5d2-5b5bd95a9369/openstack-network-exporter/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.757415 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server-init/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.939766 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovs-vswitchd/0.log" Oct 07 12:34:14 crc kubenswrapper[4700]: I1007 12:34:14.942199 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server/0.log" Oct 07 12:34:15 crc kubenswrapper[4700]: I1007 12:34:15.085341 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ksvhb_36a3b431-4387-4ba7-a2c1-e72622594a8c/ovsdb-server-init/0.log" Oct 07 12:34:15 crc kubenswrapper[4700]: I1007 12:34:15.199215 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-s25ms_19561523-a9ea-4632-9aa7-6be23fa3eee5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:15 crc kubenswrapper[4700]: I1007 12:34:15.778676 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b5f78cd-b302-4c30-87c5-82954e351d55/openstack-network-exporter/0.log" Oct 07 12:34:15 crc kubenswrapper[4700]: I1007 12:34:15.789374 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b5f78cd-b302-4c30-87c5-82954e351d55/ovn-northd/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.010080 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_57e7be90-ef51-432f-afa7-edbff56123e0/openstack-network-exporter/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.038220 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_57e7be90-ef51-432f-afa7-edbff56123e0/ovsdbserver-nb/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.262065 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f13aad27-7d23-4de6-8de0-c8a61809de5d/openstack-network-exporter/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 
12:34:16.266476 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f13aad27-7d23-4de6-8de0-c8a61809de5d/ovsdbserver-sb/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.521999 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d99cbbb56-xqb5x_36efa6df-bc80-48f6-8611-e8dff3530d8e/placement-api/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.630195 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d99cbbb56-xqb5x_36efa6df-bc80-48f6-8611-e8dff3530d8e/placement-log/0.log" Oct 07 12:34:16 crc kubenswrapper[4700]: I1007 12:34:16.830557 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/init-config-reloader/0.log" Oct 07 12:34:17 crc kubenswrapper[4700]: I1007 12:34:17.578448 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/config-reloader/0.log" Oct 07 12:34:17 crc kubenswrapper[4700]: I1007 12:34:17.607326 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/init-config-reloader/0.log" Oct 07 12:34:17 crc kubenswrapper[4700]: I1007 12:34:17.617439 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/prometheus/0.log" Oct 07 12:34:17 crc kubenswrapper[4700]: I1007 12:34:17.788202 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7b10d3cc-1670-4070-9e12-7049b2906d9d/thanos-sidecar/0.log" Oct 07 12:34:17 crc kubenswrapper[4700]: I1007 12:34:17.850460 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/setup-container/0.log" Oct 07 12:34:18 crc 
kubenswrapper[4700]: I1007 12:34:18.048288 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/rabbitmq/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.062108 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_abea1f83-cad5-40e9-a9d7-543660436ae0/setup-container/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.251522 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/setup-container/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.514266 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/setup-container/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.543294 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_509a6d73-2ff1-43f5-aa66-97d3a7d10e88/rabbitmq/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.744005 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2p4sb_5c0097f4-e71a-4bfe-8425-c87d93929a43/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.761759 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7926j_d1dbb92e-36fc-47a7-8c0f-2b28c5f2d636/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:18 crc kubenswrapper[4700]: I1007 12:34:18.961250 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:34:18 crc kubenswrapper[4700]: E1007 12:34:18.961555 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:34:19 crc kubenswrapper[4700]: I1007 12:34:19.172260 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2dkc5_3da1015a-c431-4f0f-971a-98b31f112e53/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:19 crc kubenswrapper[4700]: I1007 12:34:19.449624 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-h694x_e0c8a166-6dac-4916-a2a7-9367a5ba765c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:19 crc kubenswrapper[4700]: I1007 12:34:19.605593 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zdwmq_046d1005-6634-4a0a-b10c-f5e3faf34ba6/ssh-known-hosts-edpm-deployment/0.log" Oct 07 12:34:19 crc kubenswrapper[4700]: I1007 12:34:19.826101 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7cd7d44d75-xs58b_2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81/proxy-server/0.log" Oct 07 12:34:19 crc kubenswrapper[4700]: I1007 12:34:19.891618 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7cd7d44d75-xs58b_2bbd8fd6-1da1-4b74-b79d-a438dc0d4c81/proxy-httpd/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.006229 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pfsbd_3352f4b9-00aa-419c-a354-1fb7b7120ad5/swift-ring-rebalance/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.180244 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-auditor/0.log" Oct 07 12:34:20 crc 
kubenswrapper[4700]: I1007 12:34:20.222473 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-reaper/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.424498 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-server/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.424886 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/account-replicator/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.460551 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-auditor/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.640559 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-server/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.640599 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-updater/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.656256 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/container-replicator/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.829414 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-auditor/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.865987 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-expirer/0.log" Oct 07 12:34:20 crc kubenswrapper[4700]: I1007 12:34:20.921343 4700 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-replicator/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.052614 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-server/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.124785 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/object-updater/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.140587 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/rsync/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.324760 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6555d4a9-f098-43b2-9b50-f7c9d855cf6a/swift-recon-cron/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.444546 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-v2p2q_3b253611-bde5-4dcd-9291-284951206e6f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:21 crc kubenswrapper[4700]: I1007 12:34:21.608601 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4xzsk_fb8189cc-a34b-4793-91c0-7c4d5b837374/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 12:34:29 crc kubenswrapper[4700]: I1007 12:34:29.441655 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_74b58ae8-a57b-4c4d-9ca1-97bfa7ca8a81/memcached/0.log" Oct 07 12:34:33 crc kubenswrapper[4700]: I1007 12:34:33.965751 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:34:33 crc kubenswrapper[4700]: E1007 12:34:33.966654 4700 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:34:37 crc kubenswrapper[4700]: I1007 12:34:37.286289 4700 generic.go:334] "Generic (PLEG): container finished" podID="7b3701aa-5891-4023-9d82-72487c196cd7" containerID="77c25e2b945b0b6281ca00d66c0a048772f5386d6f2640ce80fe2e10f9f94331" exitCode=0 Oct 07 12:34:37 crc kubenswrapper[4700]: I1007 12:34:37.286387 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" event={"ID":"7b3701aa-5891-4023-9d82-72487c196cd7","Type":"ContainerDied","Data":"77c25e2b945b0b6281ca00d66c0a048772f5386d6f2640ce80fe2e10f9f94331"} Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.431548 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.466763 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-8zh8b"] Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.475205 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-8zh8b"] Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.530975 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host\") pod \"7b3701aa-5891-4023-9d82-72487c196cd7\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.531086 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zhxb\" (UniqueName: \"kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb\") pod \"7b3701aa-5891-4023-9d82-72487c196cd7\" (UID: \"7b3701aa-5891-4023-9d82-72487c196cd7\") " Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.531253 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host" (OuterVolumeSpecName: "host") pod "7b3701aa-5891-4023-9d82-72487c196cd7" (UID: "7b3701aa-5891-4023-9d82-72487c196cd7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.531595 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b3701aa-5891-4023-9d82-72487c196cd7-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.538620 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb" (OuterVolumeSpecName: "kube-api-access-2zhxb") pod "7b3701aa-5891-4023-9d82-72487c196cd7" (UID: "7b3701aa-5891-4023-9d82-72487c196cd7"). InnerVolumeSpecName "kube-api-access-2zhxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:38 crc kubenswrapper[4700]: I1007 12:34:38.633394 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zhxb\" (UniqueName: \"kubernetes.io/projected/7b3701aa-5891-4023-9d82-72487c196cd7-kube-api-access-2zhxb\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.316078 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3fc96ab859c944257cead52d8fe3fdc068dbf084b78b967c5e02798ed652273" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.316634 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-8zh8b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.651084 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-dp76b"] Oct 07 12:34:39 crc kubenswrapper[4700]: E1007 12:34:39.651496 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3701aa-5891-4023-9d82-72487c196cd7" containerName="container-00" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.651508 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3701aa-5891-4023-9d82-72487c196cd7" containerName="container-00" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.651739 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3701aa-5891-4023-9d82-72487c196cd7" containerName="container-00" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.652462 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.655598 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d2n8h"/"default-dockercfg-jqcts" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.754930 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.755293 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvjs4\" (UniqueName: \"kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " 
pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.857708 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.857838 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvjs4\" (UniqueName: \"kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.857847 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.978214 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvjs4\" (UniqueName: \"kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4\") pod \"crc-debug-dp76b\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:39 crc kubenswrapper[4700]: I1007 12:34:39.978781 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3701aa-5891-4023-9d82-72487c196cd7" path="/var/lib/kubelet/pods/7b3701aa-5891-4023-9d82-72487c196cd7/volumes" Oct 07 12:34:40 crc kubenswrapper[4700]: I1007 12:34:40.271581 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:41 crc kubenswrapper[4700]: I1007 12:34:41.336704 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" event={"ID":"0f3aa39b-6121-4ebc-b80f-e288968627e4","Type":"ContainerStarted","Data":"a52857fb60633e09284c47209e1be6c08f151c3968ed77018e3bfb8835f9379a"} Oct 07 12:34:41 crc kubenswrapper[4700]: I1007 12:34:41.337531 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" event={"ID":"0f3aa39b-6121-4ebc-b80f-e288968627e4","Type":"ContainerStarted","Data":"450a541e1e5dd38cd331d2d69367ef376602bb2c7793c02ea04755bd0d5ce5fc"} Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.350064 4700 generic.go:334] "Generic (PLEG): container finished" podID="0f3aa39b-6121-4ebc-b80f-e288968627e4" containerID="a52857fb60633e09284c47209e1be6c08f151c3968ed77018e3bfb8835f9379a" exitCode=0 Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.350125 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" event={"ID":"0f3aa39b-6121-4ebc-b80f-e288968627e4","Type":"ContainerDied","Data":"a52857fb60633e09284c47209e1be6c08f151c3968ed77018e3bfb8835f9379a"} Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.468602 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.628647 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvjs4\" (UniqueName: \"kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4\") pod \"0f3aa39b-6121-4ebc-b80f-e288968627e4\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.629500 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host\") pod \"0f3aa39b-6121-4ebc-b80f-e288968627e4\" (UID: \"0f3aa39b-6121-4ebc-b80f-e288968627e4\") " Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.629656 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host" (OuterVolumeSpecName: "host") pod "0f3aa39b-6121-4ebc-b80f-e288968627e4" (UID: "0f3aa39b-6121-4ebc-b80f-e288968627e4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.630481 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f3aa39b-6121-4ebc-b80f-e288968627e4-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.654919 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4" (OuterVolumeSpecName: "kube-api-access-zvjs4") pod "0f3aa39b-6121-4ebc-b80f-e288968627e4" (UID: "0f3aa39b-6121-4ebc-b80f-e288968627e4"). InnerVolumeSpecName "kube-api-access-zvjs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:42 crc kubenswrapper[4700]: I1007 12:34:42.731888 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvjs4\" (UniqueName: \"kubernetes.io/projected/0f3aa39b-6121-4ebc-b80f-e288968627e4-kube-api-access-zvjs4\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:43 crc kubenswrapper[4700]: I1007 12:34:43.361077 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" event={"ID":"0f3aa39b-6121-4ebc-b80f-e288968627e4","Type":"ContainerDied","Data":"450a541e1e5dd38cd331d2d69367ef376602bb2c7793c02ea04755bd0d5ce5fc"} Oct 07 12:34:43 crc kubenswrapper[4700]: I1007 12:34:43.361126 4700 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450a541e1e5dd38cd331d2d69367ef376602bb2c7793c02ea04755bd0d5ce5fc" Oct 07 12:34:43 crc kubenswrapper[4700]: I1007 12:34:43.361192 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-dp76b" Oct 07 12:34:46 crc kubenswrapper[4700]: I1007 12:34:46.957531 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:34:46 crc kubenswrapper[4700]: E1007 12:34:46.958664 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:34:47 crc kubenswrapper[4700]: I1007 12:34:47.725536 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-dp76b"] Oct 07 12:34:47 crc kubenswrapper[4700]: I1007 12:34:47.733497 4700 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-dp76b"] Oct 07 12:34:47 crc kubenswrapper[4700]: I1007 12:34:47.972419 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3aa39b-6121-4ebc-b80f-e288968627e4" path="/var/lib/kubelet/pods/0f3aa39b-6121-4ebc-b80f-e288968627e4/volumes" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.151005 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-qpsbw"] Oct 07 12:34:49 crc kubenswrapper[4700]: E1007 12:34:49.151883 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aa39b-6121-4ebc-b80f-e288968627e4" containerName="container-00" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.151900 4700 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aa39b-6121-4ebc-b80f-e288968627e4" containerName="container-00" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.152137 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3aa39b-6121-4ebc-b80f-e288968627e4" containerName="container-00" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.152997 4700 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.155999 4700 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d2n8h"/"default-dockercfg-jqcts" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.247177 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.247270 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwxq\" (UniqueName: \"kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.349093 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwxq\" (UniqueName: \"kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.349279 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.349353 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:49 crc kubenswrapper[4700]: I1007 12:34:49.976259 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwxq\" (UniqueName: \"kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq\") pod \"crc-debug-qpsbw\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:50 crc kubenswrapper[4700]: I1007 12:34:50.073597 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:50 crc kubenswrapper[4700]: I1007 12:34:50.426656 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" event={"ID":"8fc753e7-584c-4b3b-9800-7afe044ba968","Type":"ContainerStarted","Data":"ce8f5f25a358f4f144bb94788236e3a7e76156383734df83a2052aa192e71760"} Oct 07 12:34:51 crc kubenswrapper[4700]: I1007 12:34:51.441668 4700 generic.go:334] "Generic (PLEG): container finished" podID="8fc753e7-584c-4b3b-9800-7afe044ba968" containerID="904b297002bf4d724cbd734c944be8d2d3e704782f59d0e35131e7d1bd2340a2" exitCode=0 Oct 07 12:34:51 crc kubenswrapper[4700]: I1007 12:34:51.442591 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" event={"ID":"8fc753e7-584c-4b3b-9800-7afe044ba968","Type":"ContainerDied","Data":"904b297002bf4d724cbd734c944be8d2d3e704782f59d0e35131e7d1bd2340a2"} Oct 07 12:34:51 crc kubenswrapper[4700]: I1007 12:34:51.516119 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2n8h/crc-debug-qpsbw"] Oct 07 12:34:51 crc kubenswrapper[4700]: I1007 12:34:51.530842 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-d2n8h/crc-debug-qpsbw"] Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.554561 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.616165 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host\") pod \"8fc753e7-584c-4b3b-9800-7afe044ba968\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.616245 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host" (OuterVolumeSpecName: "host") pod "8fc753e7-584c-4b3b-9800-7afe044ba968" (UID: "8fc753e7-584c-4b3b-9800-7afe044ba968"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.616685 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwxq\" (UniqueName: \"kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq\") pod \"8fc753e7-584c-4b3b-9800-7afe044ba968\" (UID: \"8fc753e7-584c-4b3b-9800-7afe044ba968\") " Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.617282 4700 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fc753e7-584c-4b3b-9800-7afe044ba968-host\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.622749 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq" (OuterVolumeSpecName: "kube-api-access-sgwxq") pod "8fc753e7-584c-4b3b-9800-7afe044ba968" (UID: "8fc753e7-584c-4b3b-9800-7afe044ba968"). 
InnerVolumeSpecName "kube-api-access-sgwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:52 crc kubenswrapper[4700]: I1007 12:34:52.719807 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwxq\" (UniqueName: \"kubernetes.io/projected/8fc753e7-584c-4b3b-9800-7afe044ba968-kube-api-access-sgwxq\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:53 crc kubenswrapper[4700]: I1007 12:34:53.462452 4700 scope.go:117] "RemoveContainer" containerID="904b297002bf4d724cbd734c944be8d2d3e704782f59d0e35131e7d1bd2340a2" Oct 07 12:34:53 crc kubenswrapper[4700]: I1007 12:34:53.462519 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/crc-debug-qpsbw" Oct 07 12:34:53 crc kubenswrapper[4700]: I1007 12:34:53.973073 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc753e7-584c-4b3b-9800-7afe044ba968" path="/var/lib/kubelet/pods/8fc753e7-584c-4b3b-9800-7afe044ba968/volumes" Oct 07 12:34:55 crc kubenswrapper[4700]: I1007 12:34:55.994527 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.164083 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.179420 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.310976 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.313119 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/util/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.416365 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/extract/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.446976 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_673cb1016b9a8da3abb39c6f91d73e044ad6743308c34072552731c296466ml_6227f006-b587-451f-bff5-cf97da256b9f/pull/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.614961 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-vvp7c_b69da5cc-fa66-4adb-b136-1efe25092b40/kube-rbac-proxy/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.664186 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-vvp7c_b69da5cc-fa66-4adb-b136-1efe25092b40/manager/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.784446 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-mg8jv_b5298439-5a42-4bca-aa5b-c3fb26b2e5e3/kube-rbac-proxy/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.823433 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-mg8jv_b5298439-5a42-4bca-aa5b-c3fb26b2e5e3/manager/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: 
I1007 12:34:56.875958 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9xkbh_6b424cb2-c37e-4db9-86f4-75132c345127/kube-rbac-proxy/0.log" Oct 07 12:34:56 crc kubenswrapper[4700]: I1007 12:34:56.972476 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9xkbh_6b424cb2-c37e-4db9-86f4-75132c345127/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.002948 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8dhrp_0ef02b09-6290-414e-b0b3-f9d52138d53d/kube-rbac-proxy/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.144264 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8dhrp_0ef02b09-6290-414e-b0b3-f9d52138d53d/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.212179 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s2qwp_6992fc43-2f9e-414d-8c14-f08185ed395a/kube-rbac-proxy/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.276631 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s2qwp_6992fc43-2f9e-414d-8c14-f08185ed395a/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.384656 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-69f5t_6f56ee78-9e72-4fbb-abff-985e142a17cb/kube-rbac-proxy/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.397430 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-69f5t_6f56ee78-9e72-4fbb-abff-985e142a17cb/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.698652 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5nnwj_0df5f995-5a5e-4c40-a498-9dd5ffd4381c/kube-rbac-proxy/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.748577 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5nnwj_0df5f995-5a5e-4c40-a498-9dd5ffd4381c/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.812062 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-dfs4r_42762f72-5039-4394-a311-299b57c3485a/kube-rbac-proxy/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.925610 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-dfs4r_42762f72-5039-4394-a311-299b57c3485a/manager/0.log" Oct 07 12:34:57 crc kubenswrapper[4700]: I1007 12:34:57.992576 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-dnn6x_e260e7ed-c267-40b2-861a-9a77325e027a/kube-rbac-proxy/0.log" Oct 07 12:34:58 crc kubenswrapper[4700]: I1007 12:34:58.706578 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-dnn6x_e260e7ed-c267-40b2-861a-9a77325e027a/manager/0.log" Oct 07 12:34:58 crc kubenswrapper[4700]: I1007 12:34:58.761503 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-n4w8d_1f959f97-0c30-4e7c-a006-9517950bc1c1/manager/0.log" Oct 07 12:34:58 crc kubenswrapper[4700]: I1007 12:34:58.780438 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-n4w8d_1f959f97-0c30-4e7c-a006-9517950bc1c1/kube-rbac-proxy/0.log" Oct 07 12:34:58 crc kubenswrapper[4700]: I1007 12:34:58.904731 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5szjk_bc62ffd3-f1d8-46c2-8777-c6ad960d68a8/kube-rbac-proxy/0.log" Oct 07 12:34:58 crc kubenswrapper[4700]: I1007 12:34:58.940474 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5szjk_bc62ffd3-f1d8-46c2-8777-c6ad960d68a8/manager/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.120275 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jrw6r_1986b44c-0b98-4c70-a2a3-78f86e586d87/kube-rbac-proxy/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.141448 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jrw6r_1986b44c-0b98-4c70-a2a3-78f86e586d87/manager/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.237288 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xxzx6_7a64a375-b3e0-47ac-b715-cea59989a781/kube-rbac-proxy/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.381435 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-x8twn_4e24a2f2-716e-4273-86a7-7ad450736748/kube-rbac-proxy/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.386664 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xxzx6_7a64a375-b3e0-47ac-b715-cea59989a781/manager/0.log" Oct 07 12:34:59 crc 
kubenswrapper[4700]: I1007 12:34:59.424686 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-x8twn_4e24a2f2-716e-4273-86a7-7ad450736748/manager/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.557859 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s_e4140e5b-60c0-42c1-9440-0070b773f8c6/kube-rbac-proxy/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.593656 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ctqg6s_e4140e5b-60c0-42c1-9440-0070b773f8c6/manager/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.661541 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-589f7cdddc-lk7np_86cca1e4-373c-41df-b120-c3199ef30fe0/kube-rbac-proxy/0.log" Oct 07 12:34:59 crc kubenswrapper[4700]: I1007 12:34:59.821188 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6489b698cc-vp52r_6eb7ce42-9c6b-49c9-b46e-333112be077d/kube-rbac-proxy/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.405227 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7lqzz_5d24bd3d-929c-478a-9c26-3a94f09dd79a/registry-server/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.429996 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6489b698cc-vp52r_6eb7ce42-9c6b-49c9-b46e-333112be077d/operator/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.538692 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-jv8c7_d50469d1-cf53-4396-9d9c-f03db7eb43f3/kube-rbac-proxy/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.632010 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-jv8c7_d50469d1-cf53-4396-9d9c-f03db7eb43f3/manager/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.658460 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-xvtgc_81628b8b-eb67-4514-bbb4-44341c3962ce/kube-rbac-proxy/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.816763 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-xvtgc_81628b8b-eb67-4514-bbb4-44341c3962ce/manager/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.913652 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rcxsc_ad8f84bd-e06b-4015-8168-938a9e1ebeaa/operator/0.log" Oct 07 12:35:00 crc kubenswrapper[4700]: I1007 12:35:00.996912 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-589f7cdddc-lk7np_86cca1e4-373c-41df-b120-c3199ef30fe0/manager/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.018660 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-nq2r8_0f2a771c-5adf-4c17-94ac-d7b988e3ea86/kube-rbac-proxy/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.079668 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-nq2r8_0f2a771c-5adf-4c17-94ac-d7b988e3ea86/manager/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.174201 4700 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/kube-rbac-proxy/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.299293 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-9zh29_ae80f142-9ac8-4614-a0c5-b74dfe98b0c8/kube-rbac-proxy/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.329052 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bf98bb7b6-ghwv9_b5911608-e23e-46e8-9637-488593110278/manager/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.403705 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-9zh29_ae80f142-9ac8-4614-a0c5-b74dfe98b0c8/manager/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.433760 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-khdlk_c147a183-5f67-45a0-a971-87e75df2a66e/kube-rbac-proxy/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.506076 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-khdlk_c147a183-5f67-45a0-a971-87e75df2a66e/manager/0.log" Oct 07 12:35:01 crc kubenswrapper[4700]: I1007 12:35:01.957980 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:35:01 crc kubenswrapper[4700]: E1007 12:35:01.958285 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:35:14 crc kubenswrapper[4700]: I1007 12:35:14.958188 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:35:14 crc kubenswrapper[4700]: E1007 12:35:14.959221 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:35:18 crc kubenswrapper[4700]: I1007 12:35:18.638282 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wcjjj_c271ef31-887c-4b30-857a-7969eb9063bf/control-plane-machine-set-operator/0.log" Oct 07 12:35:18 crc kubenswrapper[4700]: I1007 12:35:18.873546 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hbprv_4c92443d-f8f6-4941-9729-013d10138707/kube-rbac-proxy/0.log" Oct 07 12:35:18 crc kubenswrapper[4700]: I1007 12:35:18.902996 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hbprv_4c92443d-f8f6-4941-9729-013d10138707/machine-api-operator/0.log" Oct 07 12:35:27 crc kubenswrapper[4700]: I1007 12:35:27.957256 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:35:27 crc kubenswrapper[4700]: E1007 12:35:27.958106 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:35:31 crc kubenswrapper[4700]: I1007 12:35:31.559457 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-x2cvv_df92deba-17c9-40f7-8079-4699a5c17bf8/cert-manager-cainjector/0.log" Oct 07 12:35:31 crc kubenswrapper[4700]: I1007 12:35:31.579500 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lv822_43b66d69-a0b2-4c0c-85d4-107ab9700398/cert-manager-controller/0.log" Oct 07 12:35:31 crc kubenswrapper[4700]: I1007 12:35:31.733469 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nxpb9_ad60c720-bd0b-4a09-807d-88587ca33ed7/cert-manager-webhook/0.log" Oct 07 12:35:41 crc kubenswrapper[4700]: I1007 12:35:41.957595 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:35:41 crc kubenswrapper[4700]: E1007 12:35:41.958386 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:35:42 crc kubenswrapper[4700]: I1007 12:35:42.616882 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-9r6fx_65303df1-c2ad-4edf-83c1-8b3d6bce8c33/nmstate-console-plugin/0.log" Oct 07 12:35:42 crc kubenswrapper[4700]: I1007 12:35:42.760112 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7q2md_c2c057c9-3fea-45df-9991-448998e13a79/nmstate-handler/0.log" Oct 07 12:35:42 crc kubenswrapper[4700]: I1007 12:35:42.762683 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-svznx_7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2/kube-rbac-proxy/0.log" Oct 07 12:35:42 crc kubenswrapper[4700]: I1007 12:35:42.794294 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-svznx_7d6bc40a-05b0-4ff6-9111-44cc8bc88ea2/nmstate-metrics/0.log" Oct 07 12:35:42 crc kubenswrapper[4700]: I1007 12:35:42.935939 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-h9nxf_ab86639c-d812-4e56-9f44-f8727fa8b6b5/nmstate-operator/0.log" Oct 07 12:35:43 crc kubenswrapper[4700]: I1007 12:35:43.014766 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k4h2w_607c033a-3b68-4731-92c3-c9a9a08acd5c/nmstate-webhook/0.log" Oct 07 12:35:56 crc kubenswrapper[4700]: I1007 12:35:56.958081 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:35:56 crc kubenswrapper[4700]: E1007 12:35:56.960943 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 
07 12:35:56 crc kubenswrapper[4700]: I1007 12:35:56.976207 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-47w4l_ab4aa75e-5802-4ba7-b88c-655fad15d8af/kube-rbac-proxy/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.139873 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-47w4l_ab4aa75e-5802-4ba7-b88c-655fad15d8af/controller/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.147869 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-59fcf_173c5607-1006-4c8c-afc1-79c8248bbe7a/frr-k8s-webhook-server/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.324447 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.451093 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.457663 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.494772 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.504188 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.711011 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:35:57 crc 
kubenswrapper[4700]: I1007 12:35:57.735195 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.737559 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.745525 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.956498 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-reloader/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.957141 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-frr-files/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.959638 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/cp-metrics/0.log" Oct 07 12:35:57 crc kubenswrapper[4700]: I1007 12:35:57.974345 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/controller/0.log" Oct 07 12:35:58 crc kubenswrapper[4700]: I1007 12:35:58.162497 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/frr-metrics/0.log" Oct 07 12:35:58 crc kubenswrapper[4700]: I1007 12:35:58.180733 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/kube-rbac-proxy-frr/0.log" Oct 07 12:35:58 crc kubenswrapper[4700]: I1007 12:35:58.213855 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/kube-rbac-proxy/0.log" Oct 07 12:35:59 crc kubenswrapper[4700]: I1007 12:35:59.080552 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/reloader/0.log" Oct 07 12:35:59 crc kubenswrapper[4700]: I1007 12:35:59.113627 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-779847879b-zbmnp_7089843f-2eed-4318-b373-ff19fa518a8d/manager/0.log" Oct 07 12:35:59 crc kubenswrapper[4700]: I1007 12:35:59.344089 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654f9bf6d-jll85_97d901d3-6f2a-4e96-8578-f169200d5f6a/webhook-server/0.log" Oct 07 12:35:59 crc kubenswrapper[4700]: I1007 12:35:59.588687 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pv9ct_30cc7c7a-bda9-4b40-972b-6c87af01ad23/kube-rbac-proxy/0.log" Oct 07 12:35:59 crc kubenswrapper[4700]: I1007 12:35:59.959990 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zz4cl_d2a66390-1e1f-433a-8637-8f9c03197ab8/frr/0.log" Oct 07 12:36:00 crc kubenswrapper[4700]: I1007 12:36:00.040925 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pv9ct_30cc7c7a-bda9-4b40-972b-6c87af01ad23/speaker/0.log" Oct 07 12:36:09 crc kubenswrapper[4700]: I1007 12:36:09.957295 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:36:09 crc kubenswrapper[4700]: E1007 12:36:09.958051 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.103859 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.291476 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.306614 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.321346 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.476126 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/util/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.482978 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/pull/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.526553 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pc9tk_13ec320d-824d-4483-bdc7-6a419cbdd630/extract/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 
12:36:12.686707 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.832886 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.843125 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:36:12 crc kubenswrapper[4700]: I1007 12:36:12.868651 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.000660 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/util/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.025039 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/extract/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.035464 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dnns6m_a1e54796-2008-49d7-9ab4-0a865b57e743/pull/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.191951 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.319905 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.347169 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.355043 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.507491 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-utilities/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.580202 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/extract-content/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.749427 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.917509 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.920138 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:36:13 crc kubenswrapper[4700]: I1007 12:36:13.970845 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.178211 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ngg7w_152601b0-4148-4077-bf15-899a1ee66ce7/registry-server/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.323981 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-content/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.364814 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/extract-utilities/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.666034 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.793541 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.794786 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.900753 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:36:14 crc kubenswrapper[4700]: I1007 12:36:14.993338 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k5jbs_4a424dee-5265-41d8-8ef7-fe5d2bcec50c/registry-server/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.131527 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/pull/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.149699 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/util/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.170087 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4vqf8_280bc2c9-6204-4779-b7c8-a09260dd2a66/marketplace-operator/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.188150 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835clvsb7_012580db-7236-454c-a0c9-e53ca0cefe4c/extract/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.360565 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.503671 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.520995 4700 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.563935 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.693130 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-content/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.713332 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/extract-utilities/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.813545 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fvvh8_32b97e41-e040-497c-8feb-312e9b11364a/registry-server/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.814979 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.961326 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.980323 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:36:15 crc kubenswrapper[4700]: I1007 12:36:15.981818 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:36:16 crc kubenswrapper[4700]: I1007 12:36:16.160916 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-content/0.log" Oct 07 12:36:16 crc kubenswrapper[4700]: I1007 12:36:16.180487 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/extract-utilities/0.log" Oct 07 12:36:16 crc kubenswrapper[4700]: I1007 12:36:16.649563 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pw4v8_3df6a670-4c88-45f9-a160-f35e4b7b0b64/registry-server/0.log" Oct 07 12:36:22 crc kubenswrapper[4700]: I1007 12:36:22.959448 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:36:22 crc kubenswrapper[4700]: E1007 12:36:22.960647 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:36:28 crc kubenswrapper[4700]: I1007 12:36:28.940991 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-8xvhj_398ce44d-03fb-4ee9-ac61-2ca3fd52074e/prometheus-operator/0.log" Oct 07 12:36:29 crc kubenswrapper[4700]: I1007 12:36:29.082914 4700 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76fc5d7d48-7mcmz_0319fc60-cd28-49d8-af70-3a2306fe89fd/prometheus-operator-admission-webhook/0.log" Oct 07 12:36:29 crc kubenswrapper[4700]: I1007 12:36:29.132555 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76fc5d7d48-82r7d_7d619134-8aae-4140-b5ca-33deeac1a66c/prometheus-operator-admission-webhook/0.log" Oct 07 12:36:29 crc kubenswrapper[4700]: I1007 12:36:29.283071 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-w5vqc_c82137a0-2748-492d-bd33-39b03e9c8139/operator/0.log" Oct 07 12:36:29 crc kubenswrapper[4700]: I1007 12:36:29.324659 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-fnrbp_45384a63-61c2-4d8b-906a-e7545addde11/perses-operator/0.log" Oct 07 12:36:34 crc kubenswrapper[4700]: I1007 12:36:34.957402 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:36:34 crc kubenswrapper[4700]: E1007 12:36:34.958096 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:36:48 crc kubenswrapper[4700]: I1007 12:36:48.958068 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:36:48 crc kubenswrapper[4700]: E1007 12:36:48.958992 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:36:59 crc kubenswrapper[4700]: I1007 12:36:59.958441 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:36:59 crc kubenswrapper[4700]: E1007 12:36:59.959599 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:37:11 crc kubenswrapper[4700]: I1007 12:37:11.957605 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:37:11 crc kubenswrapper[4700]: E1007 12:37:11.958392 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:37:23 crc kubenswrapper[4700]: I1007 12:37:23.987829 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:37:23 crc kubenswrapper[4700]: E1007 12:37:23.989066 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:37:36 crc kubenswrapper[4700]: I1007 12:37:36.957730 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:37:36 crc kubenswrapper[4700]: E1007 12:37:36.958720 4700 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6h5r_openshift-machine-config-operator(97a77b38-e9b1-4243-ac3a-28d83d87cf15)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" podUID="97a77b38-e9b1-4243-ac3a-28d83d87cf15" Oct 07 12:37:51 crc kubenswrapper[4700]: I1007 12:37:51.957956 4700 scope.go:117] "RemoveContainer" containerID="dd3569fef33a6369b30b4ac53f5a45541b5a0d27944d097fc6979cfc99c5dc6a" Oct 07 12:37:52 crc kubenswrapper[4700]: I1007 12:37:52.214280 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6h5r" event={"ID":"97a77b38-e9b1-4243-ac3a-28d83d87cf15","Type":"ContainerStarted","Data":"a3c3d7b3dc456d33c2f6df9763ab9d41753057dd1abd38a577809019d9e05897"} Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.173887 4700 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8n68"] Oct 07 12:38:17 crc kubenswrapper[4700]: E1007 12:38:17.174784 4700 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc753e7-584c-4b3b-9800-7afe044ba968" containerName="container-00" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.174797 4700 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8fc753e7-584c-4b3b-9800-7afe044ba968" containerName="container-00" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.174980 4700 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc753e7-584c-4b3b-9800-7afe044ba968" containerName="container-00" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.176419 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.192423 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8n68"] Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.359897 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/04dfe208-bba9-4b8e-8677-369e22bf89fa-kube-api-access-bn4xv\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.360275 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-catalog-content\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.360437 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-utilities\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.364149 4700 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.366184 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.375986 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.462513 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/04dfe208-bba9-4b8e-8677-369e22bf89fa-kube-api-access-bn4xv\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.462773 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-catalog-content\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.462844 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-utilities\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.463249 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-catalog-content\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc 
kubenswrapper[4700]: I1007 12:38:17.463357 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04dfe208-bba9-4b8e-8677-369e22bf89fa-utilities\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.491270 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/04dfe208-bba9-4b8e-8677-369e22bf89fa-kube-api-access-bn4xv\") pod \"redhat-operators-s8n68\" (UID: \"04dfe208-bba9-4b8e-8677-369e22bf89fa\") " pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.510030 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.564683 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.564849 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.564968 4700 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8lrj\" (UniqueName: 
\"kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.666801 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.667323 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8lrj\" (UniqueName: \"kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.667400 4700 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.668262 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.668629 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.694605 4700 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8lrj\" (UniqueName: \"kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj\") pod \"certified-operators-zsgq8\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:17 crc kubenswrapper[4700]: I1007 12:38:17.983674 4700 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:18 crc kubenswrapper[4700]: I1007 12:38:18.025026 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8n68"] Oct 07 12:38:18 crc kubenswrapper[4700]: I1007 12:38:18.513322 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8n68" event={"ID":"04dfe208-bba9-4b8e-8677-369e22bf89fa","Type":"ContainerStarted","Data":"d31e6617a1268aa266e8186a7a54be60f369f5a470a364e842a1eaccf6e2758a"} Oct 07 12:38:18 crc kubenswrapper[4700]: I1007 12:38:18.785107 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:18 crc kubenswrapper[4700]: W1007 12:38:18.793206 4700 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a5a831_325e_458c_b457_2ccae29fb8ad.slice/crio-76945eee5515ee182826aeeef2db04ebd59aeeaf699d1265850a86ed2e2bed21 WatchSource:0}: Error finding container 76945eee5515ee182826aeeef2db04ebd59aeeaf699d1265850a86ed2e2bed21: Status 404 returned error can't find the container with id 
76945eee5515ee182826aeeef2db04ebd59aeeaf699d1265850a86ed2e2bed21 Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.529282 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4a5a831-325e-458c-b457-2ccae29fb8ad" containerID="f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee" exitCode=0 Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.529372 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerDied","Data":"f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee"} Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.531618 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerStarted","Data":"76945eee5515ee182826aeeef2db04ebd59aeeaf699d1265850a86ed2e2bed21"} Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.535535 4700 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.536029 4700 generic.go:334] "Generic (PLEG): container finished" podID="04dfe208-bba9-4b8e-8677-369e22bf89fa" containerID="60f12ca051edd06dacf7762bbb7595d66703dd24cdaf4c2acb39725301e2bc97" exitCode=0 Oct 07 12:38:19 crc kubenswrapper[4700]: I1007 12:38:19.536116 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8n68" event={"ID":"04dfe208-bba9-4b8e-8677-369e22bf89fa","Type":"ContainerDied","Data":"60f12ca051edd06dacf7762bbb7595d66703dd24cdaf4c2acb39725301e2bc97"} Oct 07 12:38:23 crc kubenswrapper[4700]: I1007 12:38:23.576732 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" 
event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerStarted","Data":"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a"} Oct 07 12:38:23 crc kubenswrapper[4700]: I1007 12:38:23.579830 4700 generic.go:334] "Generic (PLEG): container finished" podID="aca758f7-c3a4-4f0e-b775-6b563550711f" containerID="60212ce629dd4e0bff4c9d6f7d43e25433be8c795cbebf99155d047e626eb692" exitCode=0 Oct 07 12:38:23 crc kubenswrapper[4700]: I1007 12:38:23.579860 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" event={"ID":"aca758f7-c3a4-4f0e-b775-6b563550711f","Type":"ContainerDied","Data":"60212ce629dd4e0bff4c9d6f7d43e25433be8c795cbebf99155d047e626eb692"} Oct 07 12:38:23 crc kubenswrapper[4700]: I1007 12:38:23.581107 4700 scope.go:117] "RemoveContainer" containerID="60212ce629dd4e0bff4c9d6f7d43e25433be8c795cbebf99155d047e626eb692" Oct 07 12:38:24 crc kubenswrapper[4700]: I1007 12:38:24.234037 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2n8h_must-gather-zzx7b_aca758f7-c3a4-4f0e-b775-6b563550711f/gather/0.log" Oct 07 12:38:27 crc kubenswrapper[4700]: I1007 12:38:27.645586 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4a5a831-325e-458c-b457-2ccae29fb8ad" containerID="399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a" exitCode=0 Oct 07 12:38:27 crc kubenswrapper[4700]: I1007 12:38:27.646037 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerDied","Data":"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a"} Oct 07 12:38:35 crc kubenswrapper[4700]: I1007 12:38:35.744618 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" 
event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerStarted","Data":"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37"} Oct 07 12:38:35 crc kubenswrapper[4700]: I1007 12:38:35.750864 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8n68" event={"ID":"04dfe208-bba9-4b8e-8677-369e22bf89fa","Type":"ContainerStarted","Data":"16642c3394b9fa4a6fdf40d2014b54b89982d896d874b8666c48a859ee2388d3"} Oct 07 12:38:35 crc kubenswrapper[4700]: I1007 12:38:35.776949 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zsgq8" podStartSLOduration=3.548530463 podStartE2EDuration="18.776923675s" podCreationTimestamp="2025-10-07 12:38:17 +0000 UTC" firstStartedPulling="2025-10-07 12:38:19.535115604 +0000 UTC m=+4666.331514603" lastFinishedPulling="2025-10-07 12:38:34.763508826 +0000 UTC m=+4681.559907815" observedRunningTime="2025-10-07 12:38:35.770686573 +0000 UTC m=+4682.567085572" watchObservedRunningTime="2025-10-07 12:38:35.776923675 +0000 UTC m=+4682.573322704" Oct 07 12:38:36 crc kubenswrapper[4700]: I1007 12:38:36.764295 4700 generic.go:334] "Generic (PLEG): container finished" podID="04dfe208-bba9-4b8e-8677-369e22bf89fa" containerID="16642c3394b9fa4a6fdf40d2014b54b89982d896d874b8666c48a859ee2388d3" exitCode=0 Oct 07 12:38:36 crc kubenswrapper[4700]: I1007 12:38:36.764344 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8n68" event={"ID":"04dfe208-bba9-4b8e-8677-369e22bf89fa","Type":"ContainerDied","Data":"16642c3394b9fa4a6fdf40d2014b54b89982d896d874b8666c48a859ee2388d3"} Oct 07 12:38:37 crc kubenswrapper[4700]: I1007 12:38:37.786291 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8n68" 
event={"ID":"04dfe208-bba9-4b8e-8677-369e22bf89fa","Type":"ContainerStarted","Data":"09c7c66b522e53b41201c9ec643be1fdf07358d2b791b4091b93386a21c48997"} Oct 07 12:38:37 crc kubenswrapper[4700]: I1007 12:38:37.813789 4700 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8n68" podStartSLOduration=3.189820639 podStartE2EDuration="20.813772506s" podCreationTimestamp="2025-10-07 12:38:17 +0000 UTC" firstStartedPulling="2025-10-07 12:38:19.538589294 +0000 UTC m=+4666.334988323" lastFinishedPulling="2025-10-07 12:38:37.162541201 +0000 UTC m=+4683.958940190" observedRunningTime="2025-10-07 12:38:37.807522773 +0000 UTC m=+4684.603921792" watchObservedRunningTime="2025-10-07 12:38:37.813772506 +0000 UTC m=+4684.610171495" Oct 07 12:38:37 crc kubenswrapper[4700]: I1007 12:38:37.985268 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:37 crc kubenswrapper[4700]: I1007 12:38:37.985551 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:39 crc kubenswrapper[4700]: I1007 12:38:39.039416 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zsgq8" podUID="f4a5a831-325e-458c-b457-2ccae29fb8ad" containerName="registry-server" probeResult="failure" output=< Oct 07 12:38:39 crc kubenswrapper[4700]: timeout: failed to connect service ":50051" within 1s Oct 07 12:38:39 crc kubenswrapper[4700]: > Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.514222 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2n8h/must-gather-zzx7b"] Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.515053 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" podUID="aca758f7-c3a4-4f0e-b775-6b563550711f" 
containerName="copy" containerID="cri-o://5690ab09315eeef0f9125f29e6755a99a20a88655b8145566c4d619e5a1c9fcb" gracePeriod=2 Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.525041 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2n8h/must-gather-zzx7b"] Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.891871 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2n8h_must-gather-zzx7b_aca758f7-c3a4-4f0e-b775-6b563550711f/copy/0.log" Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.892599 4700 generic.go:334] "Generic (PLEG): container finished" podID="aca758f7-c3a4-4f0e-b775-6b563550711f" containerID="5690ab09315eeef0f9125f29e6755a99a20a88655b8145566c4d619e5a1c9fcb" exitCode=143 Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.991424 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2n8h_must-gather-zzx7b_aca758f7-c3a4-4f0e-b775-6b563550711f/copy/0.log" Oct 07 12:38:45 crc kubenswrapper[4700]: I1007 12:38:45.991871 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.153126 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output\") pod \"aca758f7-c3a4-4f0e-b775-6b563550711f\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.153336 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xtsm\" (UniqueName: \"kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm\") pod \"aca758f7-c3a4-4f0e-b775-6b563550711f\" (UID: \"aca758f7-c3a4-4f0e-b775-6b563550711f\") " Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.159405 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm" (OuterVolumeSpecName: "kube-api-access-8xtsm") pod "aca758f7-c3a4-4f0e-b775-6b563550711f" (UID: "aca758f7-c3a4-4f0e-b775-6b563550711f"). InnerVolumeSpecName "kube-api-access-8xtsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.255789 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xtsm\" (UniqueName: \"kubernetes.io/projected/aca758f7-c3a4-4f0e-b775-6b563550711f-kube-api-access-8xtsm\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.296810 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "aca758f7-c3a4-4f0e-b775-6b563550711f" (UID: "aca758f7-c3a4-4f0e-b775-6b563550711f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.357859 4700 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aca758f7-c3a4-4f0e-b775-6b563550711f-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.903754 4700 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2n8h_must-gather-zzx7b_aca758f7-c3a4-4f0e-b775-6b563550711f/copy/0.log" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.904383 4700 scope.go:117] "RemoveContainer" containerID="5690ab09315eeef0f9125f29e6755a99a20a88655b8145566c4d619e5a1c9fcb" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.904502 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2n8h/must-gather-zzx7b" Oct 07 12:38:46 crc kubenswrapper[4700]: I1007 12:38:46.927146 4700 scope.go:117] "RemoveContainer" containerID="60212ce629dd4e0bff4c9d6f7d43e25433be8c795cbebf99155d047e626eb692" Oct 07 12:38:47 crc kubenswrapper[4700]: I1007 12:38:47.510535 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:47 crc kubenswrapper[4700]: I1007 12:38:47.510578 4700 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:38:47 crc kubenswrapper[4700]: I1007 12:38:47.972769 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca758f7-c3a4-4f0e-b775-6b563550711f" path="/var/lib/kubelet/pods/aca758f7-c3a4-4f0e-b775-6b563550711f/volumes" Oct 07 12:38:48 crc kubenswrapper[4700]: I1007 12:38:48.028347 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:48 crc kubenswrapper[4700]: I1007 12:38:48.075137 4700 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:48 crc kubenswrapper[4700]: I1007 12:38:48.372119 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:48 crc kubenswrapper[4700]: I1007 12:38:48.571241 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8n68" podUID="04dfe208-bba9-4b8e-8677-369e22bf89fa" containerName="registry-server" probeResult="failure" output=< Oct 07 12:38:48 crc kubenswrapper[4700]: timeout: failed to connect service ":50051" within 1s Oct 07 12:38:48 crc kubenswrapper[4700]: > Oct 07 12:38:49 crc kubenswrapper[4700]: I1007 12:38:49.936375 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zsgq8" podUID="f4a5a831-325e-458c-b457-2ccae29fb8ad" containerName="registry-server" containerID="cri-o://b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37" gracePeriod=2 Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.474449 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.642081 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content\") pod \"f4a5a831-325e-458c-b457-2ccae29fb8ad\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.642240 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities\") pod \"f4a5a831-325e-458c-b457-2ccae29fb8ad\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.642283 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8lrj\" (UniqueName: \"kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj\") pod \"f4a5a831-325e-458c-b457-2ccae29fb8ad\" (UID: \"f4a5a831-325e-458c-b457-2ccae29fb8ad\") " Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.642870 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities" (OuterVolumeSpecName: "utilities") pod "f4a5a831-325e-458c-b457-2ccae29fb8ad" (UID: "f4a5a831-325e-458c-b457-2ccae29fb8ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.650077 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj" (OuterVolumeSpecName: "kube-api-access-k8lrj") pod "f4a5a831-325e-458c-b457-2ccae29fb8ad" (UID: "f4a5a831-325e-458c-b457-2ccae29fb8ad"). InnerVolumeSpecName "kube-api-access-k8lrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.682700 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4a5a831-325e-458c-b457-2ccae29fb8ad" (UID: "f4a5a831-325e-458c-b457-2ccae29fb8ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.744999 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8lrj\" (UniqueName: \"kubernetes.io/projected/f4a5a831-325e-458c-b457-2ccae29fb8ad-kube-api-access-k8lrj\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.745258 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.745376 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a5a831-325e-458c-b457-2ccae29fb8ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.946797 4700 generic.go:334] "Generic (PLEG): container finished" podID="f4a5a831-325e-458c-b457-2ccae29fb8ad" containerID="b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37" exitCode=0 Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.946891 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsgq8" event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerDied","Data":"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37"} Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.948250 4700 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zsgq8" event={"ID":"f4a5a831-325e-458c-b457-2ccae29fb8ad","Type":"ContainerDied","Data":"76945eee5515ee182826aeeef2db04ebd59aeeaf699d1265850a86ed2e2bed21"} Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.946939 4700 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsgq8" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.948284 4700 scope.go:117] "RemoveContainer" containerID="b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.971437 4700 scope.go:117] "RemoveContainer" containerID="399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a" Oct 07 12:38:50 crc kubenswrapper[4700]: I1007 12:38:50.992788 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.003890 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zsgq8"] Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.017050 4700 scope.go:117] "RemoveContainer" containerID="f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.125935 4700 scope.go:117] "RemoveContainer" containerID="b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37" Oct 07 12:38:51 crc kubenswrapper[4700]: E1007 12:38:51.127650 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37\": container with ID starting with b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37 not found: ID does not exist" containerID="b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 
12:38:51.127714 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37"} err="failed to get container status \"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37\": rpc error: code = NotFound desc = could not find container \"b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37\": container with ID starting with b304a46f8a0fc849d378408f4a6513ca78eceadcfc10ad01d79a3d634a0f2f37 not found: ID does not exist" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.127751 4700 scope.go:117] "RemoveContainer" containerID="399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a" Oct 07 12:38:51 crc kubenswrapper[4700]: E1007 12:38:51.131729 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a\": container with ID starting with 399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a not found: ID does not exist" containerID="399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.131789 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a"} err="failed to get container status \"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a\": rpc error: code = NotFound desc = could not find container \"399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a\": container with ID starting with 399874315b27eb2b00ebdcbdc1a88934bb38439747873a6e8084d3fb3401634a not found: ID does not exist" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.131823 4700 scope.go:117] "RemoveContainer" containerID="f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee" Oct 07 12:38:51 crc 
kubenswrapper[4700]: E1007 12:38:51.135862 4700 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee\": container with ID starting with f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee not found: ID does not exist" containerID="f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.135920 4700 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee"} err="failed to get container status \"f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee\": rpc error: code = NotFound desc = could not find container \"f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee\": container with ID starting with f61a60111173853788a7ba55abb3dd394faa938cbee5d3fdd9d7342898f8a7ee not found: ID does not exist" Oct 07 12:38:51 crc kubenswrapper[4700]: I1007 12:38:51.968352 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a5a831-325e-458c-b457-2ccae29fb8ad" path="/var/lib/kubelet/pods/f4a5a831-325e-458c-b457-2ccae29fb8ad/volumes" Oct 07 12:38:58 crc kubenswrapper[4700]: I1007 12:38:58.566844 4700 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8n68" podUID="04dfe208-bba9-4b8e-8677-369e22bf89fa" containerName="registry-server" probeResult="failure" output=< Oct 07 12:38:58 crc kubenswrapper[4700]: timeout: failed to connect service ":50051" within 1s Oct 07 12:38:58 crc kubenswrapper[4700]: > Oct 07 12:39:07 crc kubenswrapper[4700]: I1007 12:39:07.573149 4700 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:39:07 crc kubenswrapper[4700]: I1007 12:39:07.629712 4700 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8n68" Oct 07 12:39:07 crc kubenswrapper[4700]: I1007 12:39:07.720215 4700 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8n68"] Oct 07 12:39:07 crc kubenswrapper[4700]: I1007 12:39:07.811980 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 12:39:07 crc kubenswrapper[4700]: I1007 12:39:07.812211 4700 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pw4v8" podUID="3df6a670-4c88-45f9-a160-f35e4b7b0b64" containerName="registry-server" containerID="cri-o://2f825b75fd626a149cdea06fd21b612da21badabe611adfcc3973f965b6135ea" gracePeriod=2 Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.124473 4700 generic.go:334] "Generic (PLEG): container finished" podID="3df6a670-4c88-45f9-a160-f35e4b7b0b64" containerID="2f825b75fd626a149cdea06fd21b612da21badabe611adfcc3973f965b6135ea" exitCode=0 Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.125049 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerDied","Data":"2f825b75fd626a149cdea06fd21b612da21badabe611adfcc3973f965b6135ea"} Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.394513 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.518732 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities\") pod \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.518825 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm2kv\" (UniqueName: \"kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv\") pod \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.518906 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") pod \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.520939 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities" (OuterVolumeSpecName: "utilities") pod "3df6a670-4c88-45f9-a160-f35e4b7b0b64" (UID: "3df6a670-4c88-45f9-a160-f35e4b7b0b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.537040 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv" (OuterVolumeSpecName: "kube-api-access-nm2kv") pod "3df6a670-4c88-45f9-a160-f35e4b7b0b64" (UID: "3df6a670-4c88-45f9-a160-f35e4b7b0b64"). InnerVolumeSpecName "kube-api-access-nm2kv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.619733 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3df6a670-4c88-45f9-a160-f35e4b7b0b64" (UID: "3df6a670-4c88-45f9-a160-f35e4b7b0b64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.621222 4700 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") pod \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\" (UID: \"3df6a670-4c88-45f9-a160-f35e4b7b0b64\") " Oct 07 12:39:08 crc kubenswrapper[4700]: W1007 12:39:08.621410 4700 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3df6a670-4c88-45f9-a160-f35e4b7b0b64/volumes/kubernetes.io~empty-dir/catalog-content Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.621472 4700 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3df6a670-4c88-45f9-a160-f35e4b7b0b64" (UID: "3df6a670-4c88-45f9-a160-f35e4b7b0b64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.622058 4700 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.622126 4700 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm2kv\" (UniqueName: \"kubernetes.io/projected/3df6a670-4c88-45f9-a160-f35e4b7b0b64-kube-api-access-nm2kv\") on node \"crc\" DevicePath \"\"" Oct 07 12:39:08 crc kubenswrapper[4700]: I1007 12:39:08.622191 4700 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df6a670-4c88-45f9-a160-f35e4b7b0b64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.141618 4700 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw4v8" event={"ID":"3df6a670-4c88-45f9-a160-f35e4b7b0b64","Type":"ContainerDied","Data":"4cdcf7d612460fefc6a1e419b4965fb53b128ede494e4662d460a8e0999d270b"} Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.141692 4700 scope.go:117] "RemoveContainer" containerID="2f825b75fd626a149cdea06fd21b612da21badabe611adfcc3973f965b6135ea" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.141644 4700 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw4v8" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.192920 4700 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.197913 4700 scope.go:117] "RemoveContainer" containerID="341b827ddafe0be27747205e54b7266b6c86bd30d4148c0eeab6d5a7c4976321" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.205158 4700 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pw4v8"] Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.229600 4700 scope.go:117] "RemoveContainer" containerID="c4edfb0577f9e6d2181d6d19f412578a858887d8c4107bb9e36b88cb914ea500" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.645552 4700 scope.go:117] "RemoveContainer" containerID="77c25e2b945b0b6281ca00d66c0a048772f5386d6f2640ce80fe2e10f9f94331" Oct 07 12:39:09 crc kubenswrapper[4700]: I1007 12:39:09.968412 4700 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df6a670-4c88-45f9-a160-f35e4b7b0b64" path="/var/lib/kubelet/pods/3df6a670-4c88-45f9-a160-f35e4b7b0b64/volumes"